
Snow has always had two places to store the cost of an application and they are traditionally used for different reasons. The behaviour of this functionality recently changed – this article explains the current functionality and the recent change.

 

  1. Application Cost

    This is the cost to be used in any Risk or Cost Saving reports and in the Financial Info tab for a computer (used in cost per business area / TCO-type reports). It is entered by editing the application, and it is easy to bulk-upload prices for many applications as desired.

  2. License Cost

    This will be the Purchase price of the license divided by the number of actual licenses – in the example below the cost is £10 per installation.

  3. Financial Info

    This tab shows the cost of the Hardware and the application costs of any software on that hardware (see below)


    These values can be reported in the 'Potential Cost Savings' report, and also the 'Cost of Unused Applications per Computer' report.



    It is important to note that in versions of SLM between 8 and 9.1, if there was a license cost AND an application cost for an application, the license cost would always win. This had one unfortunate side effect: if the license cost was zero, it would override any application cost, showing risks as zero and savings as zero (if you had a license entered).

    As of SLM 9.2 there has been a change:

    If the license cost is zero, Snow uses the application cost. If the license cost has a value, Snow will use that value and ignore the application cost. In the screenshot below the application cost is £140 and the license cost is zero.



    In the screenshot below license cost per install is £10 and application cost is still £140



This means that in the Family View, Risk is shown at the license value - £10



Whereas without a license cost, the Risk is shown using the application cost only.


Note: Snow will average the license cost per installation if there are multiple licenses with different prices.
In the example below we have two licenses at different prices (£300 & £500),

meaning the license average is £400 per install in this case.
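The averaging above is simple arithmetic; here is a minimal sketch, assuming (as in the example) that each license covers a single installation — prices and quantities are taken from the screenshots:

```shell
# Two licenses at different purchase prices, each covering one installation
price_a=300; qty_a=1
price_b=500; qty_b=1

# Average license cost per install = total purchase price / total installs
avg=$(( (price_a * qty_a + price_b * qty_b) / (qty_a + qty_b) ))
echo "Average license cost per install: ${avg}"
```

With more licenses the same weighted average applies, which is why mixing very different purchase prices can shift the reported risk and savings figures.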

Background

You want to collect additional data that can’t be collected using the standard scanning capabilities of the Snow agent.

 

Previous Solution:

With Snow Inventory version 3 and version 5, Snow offered the possibility in the Windows client/agent to run PowerShell scripts to capture additional data.

These scripts were executed exclusively by our agent or client. Platforms like Linux, Unix or Mac were not covered by this PowerShell solution.

 

New Solution:

Snow Inventory 6 enables additional data to be captured independently of the platform it runs on. This new capability is called “Snow Dynamic Inventory”.

How does it work?

Since this new functionality is only available on Snow Inventory Version 6 and higher, you’ll need to upgrade to this version to unlock the potential. Besides that, you also need Snow Agent version 6 running on each of the platforms in your IT estate.

 

Step 1

Create your own script to capture your specific data. The script language does not matter; use the scripting language you know best.

Guidelines for your own scripts:

  • The script name cannot be longer than 100 characters.
  • No spaces in the filename.
  • The script itself can be placed anywhere.

 

Naming Examples:

Scan-FilevaultEnabled.sh

Scan-BitLockerStatus.ps1

Scan-Whatever.bat

  • The data that you want to collect has to be in valid JSON format and Base64 encoded.
  • Output the result to the designated output folder, with the script name as the destination
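The encoding step from the bullets above can be sketched in a few lines of shell; the JSON key here ("FileVaultEnabled") is purely illustrative:

```shell
# Minimal sketch: wrap collected data as valid JSON, then Base64-encode it
# before writing it to the agent's output folder.
json='{"FileVaultEnabled": true}'
encoded=$(printf '%s' "$json" | base64)
echo "$encoded"
```

Decoding the result with `base64 -d` should give back the original JSON exactly, which is a quick way to validate your script's output.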

 

 

 

OS         Output path

Unix       /var/tmp/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/

MacOS      /Users/Shared/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/

Windows    %ProgramData%\SnowSoftware\Inventory\Agent\script-output\ + ScriptName\

Linux      /var/run/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/

 

 

Step 2

Create and maintain a log file in the output folder named ScriptName.log

Your script is responsible for the lifecycle of the log file, i.e. deletion and/or overwrites.

Step 3

Create a metadata file along with the script output called metadata.json

 

The following tags must be included

 

Tag                       Description
ScriptName                used by the server to separate JSON blobs in the database table
ScriptRunUTCTimeStamp     the time in UTC when the script ran
DaysToRetain              used by the server to determine how many days the data should be kept in the database before being cleaned out by the Garbage Collector

 

The server will parse this information and use it when storing the data in the database.

 

Example of metadata.json file
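Based on the three required tags listed above, a metadata.json could look like this (the script name, timestamp and retention value are illustrative assumptions, not official values):

```json
{
  "ScriptName": "Scan-BitLockerStatus",
  "ScriptRunUTCTimeStamp": "2019-04-06T10:15:00Z",
  "DaysToRetain": 30
}
```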

 

 

Step 4

Schedule your own script to run!

Note:

Steps 1–3 are included in the main script.

All output data is Base64 encoded.

After running your default Snow Agent scan, all output files are collected and deleted.

 

You can find your collected data in the inventory Database:

 

 

Example for Shell Script:

This page is still under construction. Please visit later. Thank you for your patience.
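Until the official example is published, here is a minimal hedged sketch of what Steps 1–3 could look like in a single shell script. Everything specific here is an assumption for illustration: the script name, the JSON payload, the result file name and the retention value. In production the output folder would be the platform-specific path from the table above; this sketch writes to a temporary directory so it can run anywhere.

```shell
#!/bin/sh
# Hypothetical Dynamic Inventory script sketch -- not an official Snow example.
SCRIPT_NAME="Scan-Example"
# Production path on Linux would be:
#   /var/run/SnowSoftware/Inventory/Agent/script-output/$SCRIPT_NAME/
OUT_DIR="${SNOW_OUT_DIR:-$(mktemp -d)}/$SCRIPT_NAME"
mkdir -p "$OUT_DIR"

NOW=$(date -u +%Y-%m-%dT%H:%M:%SZ)

# Step 1: collect data as valid JSON and Base64-encode it into the output folder
json='{"ExampleSetting": "enabled"}'
printf '%s' "$json" | base64 > "$OUT_DIR/$SCRIPT_NAME"

# Step 2: create and maintain a log file named after the script
echo "$NOW $SCRIPT_NAME completed" >> "$OUT_DIR/$SCRIPT_NAME.log"

# Step 3: write the metadata.json that the server parses when storing the data
cat > "$OUT_DIR/metadata.json" <<EOF
{
  "ScriptName": "$SCRIPT_NAME",
  "ScriptRunUTCTimeStamp": "$NOW",
  "DaysToRetain": 30
}
EOF

echo "Output written to $OUT_DIR"
```

Step 4 is then just scheduling this script with your platform's scheduler (cron, launchd, Task Scheduler) so it runs before the regular agent scan collects the output.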

Step 4: Reconciliation

The goal of reconciliation is primarily to harmonize contract, purchase and entitlement information with normalized inventory data to establish an ELP - the balance of licenses purchased to licenses consumed. ELP forms the basis of compliance, risk-reduction, audit defense, contract (re)negotiations, license "true ups" and optimizing software spend. Given the variety of SAM stakeholders, to accomplish these tasks, reconciliation merges normalized asset data with related information from other, often external, sources.

 

Reconciliation - The questions you need to ask yourself

  • How/where do you maintain entitlement data for key publishers?
  • What are the sources of entitlement data?
  • How much time do you spend weekly/monthly/yearly on reconciling entitlements to inventory?
  • What processes are triggered when a shortfall or over-license situation is discovered?
  • What processes do you have which are fed by reconciled data?

 

Reconciliation holds together Contract, Purchase and Entitlement Information with normalized inventory data to establish an effective license position, which is the balance of purchased vs. consumed software. In practice, reconciliation requires you to add license information into a Software Asset Management (SAM) solution and then assign it to users, machines or organizational units, depending on the metric of the application.

 

License information is extensive, including the number of licenses purchased, license cost, additional use rights, maintenance and support contracts and who or what the license is assigned to. The accuracy of this process forms the basis of compliance, risk reduction, audit defense, contract negotiations, license true-ups and optimizing software spend. Due to the various stakeholders within the Software Asset Management process, to accomplish the previous tasks, reconciliation combines normalized inventory data with related information from other data sources. These sources can include procurement data, license and entitlement details and information about users and organizational structures from Active Directory.

 

Reconciled data is very useful for planning, modelling and dependency mapping, which are critical for Software Asset Management and License Optimization. Reconciliation is also very useful for other IT groups such as Service and Support, who may be developing CMDBs which contain configuration data or ticketing information. It is always highly recommended that any organization regularly performs reconciliation tasks to understand their current compliance position. This proactive approach will help minimize the risk of exposure in the event of a vendor audit.

 

The Snow Way

Within Snow License Manager companies can manage all crucial licensing information within their estate. Once licensing information is entered into Snow License Manager, it will calculate a compliance position based on the normalized inventory data provided before. The result will be a compliance position for all software applications across the estate, giving involved stakeholders action steps to either reduce over-licensing or to prevent audit fines in case of under-licensing.

 

            

Step 2: Inventory

The purpose of inventory is to capture platform configuration information and to extract a complete list of all its software.

Inventory - The questions you need to ask yourself

  • How do you compile discovery data?
  • What reporting tools do you have?
  • What confidence level do you have in the coverage and accuracy of your Inventory data?
  • If you're using SCCM, do you have software inventory enabled? What about metering? 
  • Do you review your inventory data? How often? 
  • What processes do you have which are fed by inventory data?

 

The goal of the inventory stage is to identify all the software installed or executed on every platform in your network - and how multiplatform technology can help you present this information back in one single place. The identification of these software installations is foundational to any software asset management solution.
Understanding the various ways software publishers license their software is critical in developing an effective SAM practice. A complete inventory solution should help identify the various metrics that drive license requirements. Inventory of hardware configuration is also important to SAM, as many licenses are defined by specific hardware configurations such as CPU & Core. Some inventory technologies also gather detailed software usage metrics, which can help business decisions about deployed software and are often required to optimize software spending.
Usage is also critical to identify software that isn't installed on user devices, such as web applications, Software as a Service (SaaS) applications or Virtual Desktop Infrastructure (VDI). This inventory stage helps customers to have a clear view of all their software investments across their network, removing risks associated with IT blind spots and avoiding unnecessary IT spend.

 

Inventory using Snow

 

Of course it is possible to take inventory using our tools. We provide the Snow Inventory Agent for several operating systems such as Windows, Linux, Unix and macOS. We also provide a solution to scan Oracle databases with our Snow Inventory Oracle Scanner.

 

If you use an alternative solution to inventory your environment, our Snow Integration Manager may be your choice - by offering various Snow Connectors for 3rd-party inventory sources, we are able to bring the retrieved data into Snow License Manager.

 

The following third party integration connectors are available out-of-the-box:

 

  • BMC Atrium Discovery and Dependency Mapping (ADDM)
  • BMC Discovery
  • Dell KACE
  • Discovery data from file
  • FrontRange Discovery
  • HEAT Discovery
  • HP Discovery and Dependency Mapping Inventory (DDMI)
  • HP Server Automation
  • HP Universal Discovery
  • IBM BigFix Inventory
  • IBM License Metric Tool
  • IBM Tivoli Asset Discovery for Distributed (TAD4D)
  • iQuate iQSonar
  • Ivanti Endpoint Manager
  • LANDesk
  • Microsoft Intune
  • Microsoft Intune (SCCM Hybrid)
  • Microsoft System Center Configuration Manager (SCCM)
  • Miss Marple
  • MobileIron
  • ServiceNow Discovery
  • Snow XML
  • Symantec Altiris (7.x - 8.x)
  • VMware
  • AirWatch

Note that this list might be updated in the future.

 

           

Step 3: Normalization

Enterprises often have multiple discovery and inventory solutions. Normalization is the consolidation of discovered inventory datasets to remove duplicated or conflicting information.

 

Normalization - The questions you need to ask yourself

  • How do you normalize your inventory data today? What inventory data is included?
  • Do you maintain your own catalog? How is it updated?
  • To what level is data normalized (publishers, title, edition, version, release)
  • What other tools (i.e. ITSM, CMDB) do you populate with normalized data? 
  • What processes do you have which are fed by normalized data?

 

Data can be extracted from many different sources, which means it will not be consistent. The process of normalization presents it in a friendly and easily recognizable format. It removes duplicates to present just one source of truth about each software asset.

 

The primary benefit of a normalization process is an accurate, organized inventory across different datasets. For example: one dataset recorded an executable file name as ‘application.exe’, whereas the other dataset recorded it as ‘application’. Even though it is only one licensed application, it has two unique descriptions, and should not be counted as two separate installations.

 

Many approaches to normalization classify and categorize inventory automatically, using databases of vendor, product and service names to standardize naming conventions of discovered inventory. Another good example of normalization is bundling identification. The inventory process might have discovered multiple applications including Microsoft Word, Microsoft Outlook and Microsoft PowerPoint installed on a laptop. Normalization determines that these are not three independently licensable applications, but instead a single suite.
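As a toy illustration of the naming problem above (not Snow's actual matching logic), collapsing the two descriptions of the same application might look like stripping the extension and case before de-duplicating:

```shell
# Toy illustration only: two datasets report the same application under
# different names; normalizing the names collapses them into one record.
printf '%s\n' 'Application.EXE' 'application' 'OtherTool.exe' \
  | sed 's/\.[Ee][Xx][Ee]$//' \
  | tr '[:upper:]' '[:lower:]' \
  | sort -u
```

Three raw records become two normalized ones, so the licensed application is counted once instead of twice.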

 

Some normalization solutions may also provide default metric information, group applications into application families, add upgrade and downgrade paths, release dates and end of service information.

 

The Snow Way

Snow Software addresses these needs with its Data Intelligence Service (DIS), formerly known as the Software Recognition Service (SRS). It is available as a subscription service and provides our customers with up-to-date information for more than 500,000 applications from 80,000 vendors. If you would like to know more about DIS/SRS, visit this blog post.

 

            

Step 1: Discovery

Discovery is the act of interrogating TCP/IP networks to identify network-attached physical and virtualized platforms upon which software executes.

Discover all platforms across the network and categorize these, with minimal impact, into those which have the potential to run Enterprise Software (e.g. Oracle) and those which do not.

This is to ascertain potential financial exposure from licensing requirements.

 

Discovery - The questions you need to ask yourself

  • What tools do you use to monitor your environment?
  • Are there platforms for which you don’t have discovery? (MAC, Virtual, Cloud, UNIX/Linux)
  • What percentage of your estate is covered by discovery tools?
  • What processes do you have which are fed by discovery data?

 

Discovery is the process of finding and identifying all platforms on which software resides. This first step is essential in developing a complete Software Asset Management strategy.
As the use of technology has evolved, so has the requirement to inventory new technologies - often virtual or off-network, like Software as a Service (SaaS), Infrastructure as a Service (IaaS) and mobile devices. Organizations will see more of their technology spend go to cloud services and mobile assets. This is especially important as many cloud and mobile technologies are onboarded with little or no involvement from IT, which results in a lack of visibility, making it increasingly difficult for SAM and security teams to fulfil their roles. It is critical to have an internal solution that provides appropriate discovery capabilities for the different technologies used within the environment. Essential for effective SAM is the ability to discover every asset in the estate that consumes software. A good inventory solution provides complete asset discovery as well as revealing the blind spots in the IT network. This means computers, servers, mobile devices, tablets, and all connected network devices like routers, printers and firewalls. Many customers have invested in capabilities like SCCM, but it is critical that you have 100% visibility into the entire estate.

 

The Snow Way

With Snow Inventory, Computers and devices can be discovered using LDAP lookups in an Active Directory, or by using the following technologies for network discovery on specific IP address ranges:

  • SNMP (SNMPv1)
  • SSH
  • WinRPC/WMI
  • ICMP (“ping”)
  • TCP/IP fingerprinting
  • DNS lookup
  • NIC manufacturer lookup.

When TCP/IP fingerprinting is enabled, discovery will attempt to identify the type of OS installed on the device. For details about the discovery criteria and the columns included in each discovery view, see the following tables.

 

It is also possible to configure the Snow Inventory server to perform an Active Directory discovery using LDAP or secure LDAP for both devices and users. Multiple LDAP paths can be configured (e.g. for different subnetworks).

 

You can then check the results in the built-in report in Snow License Manager named "Discovered assets that are not inventoried". In the column "Source" you will be able to see the origin of the discovered asset.

 

           

 

DINROS - Create your SAM roadmap

The purpose of the framework DINROS is to understand current SAM tool capabilities, using that baseline to plan how and where to make improvements within six critical SAM activities.

 


DINROS - The six SAM activities

 

  1. Discovery

    Discovery is the act of interrogating TCP/IP networks to identify network-attached physical and virtualized platforms upon which software executes. Discover all platforms across the network and categorize these, with minimal impact, into those which have the potential to run Enterprise Software (e.g. Oracle) and those which do not.

    This is to ascertain potential financial exposure from licensing requirements.

  2. Inventory

    The purpose of inventory is to capture platform configuration information and to extract a complete list of all its software.

  3. Normalization
    Enterprises often have multiple discovery and inventory solutions. Normalization is the consolidation of discovered inventory datasets to remove duplicated or conflicting information.

  4. Reconciliation
    The goal of reconciliation is primarily to harmonize contract, purchase and entitlement information with normalized inventory data to establish an ELP - the balance of licenses purchased to licenses consumed. ELP forms the basis of compliance, risk-reduction, audit defense, contract (re)negotiations, license "true ups" and optimizing software spend. Given the variety of SAM stakeholders, to accomplish these tasks, reconciliation merges normalized asset data with related information from other, often external, sources.

  5. Optimization
    The goals of license optimization are to optimize software spending, keep track of changing software license structures and better manage vendors. Optimizing license position means being able to reduce the number, type and expense of licenses needed and in use, as appropriate.

  6. Sharing

    SAM tools both consume and produce data and information, which they must share to be useful. Gathering technical, financial and contractual data in a central system of record for IT assets enables I&O leaders to manage vendors effectively, and software assets from requisition through retirement. Implicit is the need for an integrated workflow and data persistence, driving the need for a centralized information repository in which to store and make available the results of activities. 

 

           

 

Are you new to Snow? Be sure to check out the landing page:

Welcome to Snow – Where to start and things to know

Dear SnowGlobe users,

our Knowledge Management team has activated a brand new page where you can track all the Knowledge Base Articles you've subscribed to.

 

Make sure you are logged in to the Support portal and then click here:  Notifications and Subscriptions

On this page you can see all the KB Articles at a glance; you can also unsubscribe from them or edit the notification settings.

 

To unsubscribe from a KB Article, click on "Subscribed":

 

 

To edit the notifications, click on "Notification Preferences":

 

 

 

Additional features will come later.

This mini-portal is very useful and we hope you like it!

New report: Files per computer

 

We’ve all been in the situation where we need to understand the reason a certain application was triggered in SLM. To figure out what triggered an application, you could either edit a computer to check this at machine level or edit an application to see what triggered it on all computers. This was only available to users with edit permissions at computer/application level.

 

The need to provide a list combining computers and applications often resulted in custom reports being created for customers, which is why we’ve introduced a new standard report in SLM 9.2 that brings this functionality. Let’s provide some more insight into this new report, called "Files per computer". It can be found in the “Standard reports” group.

This report makes finding all the relevant information, like the path of a file, much easier, which also makes validating found software a lot better. Unassigned software still needs to be reviewed in the SMACC - this information is not present in the “Files per Computer” report - however, it does show the “hidden” applications that are used for recognition by SRS.

 

Please always make sure to add search criteria before hitting “Show report”, since there is a lot of data to be loaded. If you forget to add search criteria, only the first 5 machines will be shown by default to prevent unnecessary load.

 

Let’s use Java as an example! Java is currently a hot topic (https://community.snowsoftware.com/message/11636-java-licensing-changes) due to the changes in licensing that Oracle implemented starting January 2019. If you make use of the Oracle Java versions you need to identify the specific version and if it was part of an installed third-party application, to determine if this needs to be licensed with Oracle. Therefore it’s good to have an overview of machines with Java and the executable path in one single report.

Using this “Files per Computer” report, I’m now able to find all machines that have java.exe installed, and I can easily see from the path information whether it was part of a third-party application.

Since the beginning of time, Windows operating systems have had a major version and edition, and then various updates - originally Service Packs, but since Windows 10/2016 it’s been version numbers.

Snow has always reported on Service Packs, and since version numbers came into use, Snow Inventory would pick these up - but there was no report in Snow License Manager. This is now fixed: you can access version numbers in the Operating Systems Report:

And also in the computers tab within the OS Application

So why is this interesting? Older Windows 10 and 2016 builds are going End of Service, and as such need to be updated - 1607 and earlier are already End of Service.

ana.marquez

Compliance exclusions

Posted by ana.marquez Employee Apr 29, 2019

This is one of four blog posts related to features in SLM 9.2.0 release.

A capability for all SAM programs is being able to provide an accurate license position and financial risk exposure to the organisation for key software manufacturers. This is complicated in general; however, a crucial challenge is being able to account for the exceptions and exclusions that many manufacturers have within their licensing rules. This results in the need to manually adjust compliance figures each time the organisation needs to know its license position and the associated risk. This quickly becomes almost unmanageable in large organisations - even just documenting and calculating which computers are being excluded, and why, is a huge task.

 

The SAM development team at Snow has been investigating how we can help resolve this, engaging with SAM specialists from around the globe and arriving at a solution that allows our users to manage these scenarios with ease. The new feature, compliance exclusions, allows a SAM manager to document these license scenarios and tell the Snow platform which applications should be excluded from which computers. This allows Snow to remove the license requirement from computers with those applications, impacting compliance and therefore the risk and cost reported.

This is done through the creation of exclusion rules, where a SAM manager can either manually or dynamically select which applications and computers should be included in the rule, as well as give the rule a name and a description of the licensing rule it relates to. They can also see which platform user created the rule, when and by whom it was last updated, and detailed reporting on the results of all the rules that have been added.

 

To give you a taste, below is a detailed example of how this feature can be used to help solve Microsoft MSDN licensing. As always, get in touch or comment on this post with questions.

 

In many organisations purchasing Microsoft software, a number of employees and their devices are licensed for non-production use through MSDN licensing. In some cases, the SAM team might not need to track the spend on MSDN licensing, however they do need to remove the license requirements created by computers and servers dedicated to non-production use from the rest of their Microsoft licensing.

To account for this scenario in our compliance calculation, see the example below using the exclusions feature:

 

Navigate to the Administration section and then select Compliance Exclusions

Select "Add exclusion":

Enter a name and description - in this example we enter in some information related to how we are using this rule to help with MSDN licensing:

Select applications to be included in this exclusion rule, either manually or by adding dynamic criteria. In this case we add all of the applications created by Microsoft and filter out any Office 365 applications to keep the example simple; however, you could target specific application families or specific apps, or even exclude some applications. With this criteria, any new Microsoft application that gets released will be included automatically:

Select computers to be included in this exclusion rule, either manually or by adding dynamic criteria. In this case we target all computers whose hostname contains the naming convention "dev", allowing us to automatically apply the rule to new hosts that are added with that convention. Whilst this is a simple example, the complexity of the criteria can match the complexity of your environment:

A summary of all the computers and applications that will have their license requirement removed when the rule is active is shown:

When hitting save, we are prompted to activate the rule if we are ready for it to impact compliance:

Now that the rule is active, you can see how it impacts compliance on the computer details page of a computer with excluded applications:

And see its impact on the overall compliance of one of the excluded applications on the applications detail page:

 

As always, if you have any questions feel free to post in the comments section or reach out to me directly.

I've recently had a couple of requests about security information in Snow.

After providing the specific user guides and getting good feedback about their content (no doubts about it!), I thought they could be useful for the SnowGlobe community too.

Below you can find the direct permalinks, so you can reach the documents easily without searching in the full Knowledge Base.

 

Technical Description: Security Considerations in Snow License Manager 9

 

Technical Description: Security Considerations for Snow Analytics

 

Technical Description: Security in Snow Inventory for Snow Inventory 5

 

Technical Description: Security Considerations for Snow Inventory 6

 

User Guide: Federated authentication with SAML for Snow License Manager 9

 

User Guide: Federated Authentication with SAML - Update revision 8.3

 

I hope you find them useful!

 

Note: in each KB Article page, you can click on the "Subscribe" button (on the top right), to get a notification every time the User Guide is updated.

aaron.fryer

Snow Agent Monitoring

Posted by aaron.fryer Employee Apr 6, 2019

Snow Agent Monitoring

_______________________________________________________________

DOCUMENT DATE            21/03/2019
AUTHOR                              AARON FRYER


 

INTRODUCTION
This document will provide some best practice processes and tips to monitor the behaviour of the Snow Agent.


 

REQUIREMENTS

- Snow Inventory 5.x / 6.x

- Procdump tool
Download here -
https://download.sysinternals.com/files/Procdump.zip

- Windows Debugging tool
https://developer.microsoft.com/windows/downloads/windows-10-sdk


 

Using the procdump tool

 

- Download the Procdump tool
- Extract the contents to C:\Temp
- Run Command Prompt as an administrator
- Browse to the directory C:\Temp via Command Prompt

Collect a memory dump file based on high CPU usage

The "-c [%value]" argument specifies the CPU usage threshold after which a memory dump file should be collected.

The "-c" argument is usually combined with the following ones:

"-s [secs]" – specifies how long the CPU usage should remain at a specified level for a memory dump file to be collected. The default value is 10 seconds.

"-u" – specifies the CPU usage of any particular processor core to be tracked, comparing the average CPU usage of all the cores.

 

 

 

Collect a memory dump file based on high memory usage

The "-m [MBs]" argument can be used to specify memory usage threshold after which a memory dump file should be collected.

Example. Collect a memory dump file if the process memory usage is larger than 500 MB.
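Following the same pattern, a hedged example of the memory trigger described above (process name again illustrative):

```
procdump -ma -m 500 snowagent.exe
```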

Once there is a successful dump of the application that’s being monitored, you’ll find a .dmp file placed in the directory.



To view the dump file, you’ll require the Windows Debugging tool


 

Snow Agent Logging

The Snow Agent logs can be viewed remotely from the Snow Inventory console using the below method in the SMACC.

- Browse to device
- Double click the device to open the settings
- Browse to log file in the navigation pane





 

CHANGING THE LOG LEVEL OF THE AGENT

It can be useful to change the log level of the agent to give you more visibility of the errors you may be facing when troubleshooting the Snow Agent.

To do this, you’ll need to browse to the directory C:\Program Files\Snow Software\Inventory\Agent and open the snowagent.config file.

It’s recommended to use Notepad++ when opening this file to make editing easier and clearer.

On line 72 of the file, change the log level to verbose, and on line 73 set MaxSize to 10000. Below is an example of what you should see in snowagent.config once these changes have been made. Once done, save and close the file, then stop and start the Snow Inventory Agent service.

  <Logging>

    <Level>verbose</Level>

    <MaxSize>10000</MaxSize>

  </Logging>

Snow Active Directory Discovery
_______________________________________________________________

DOCUMENT DATE            14/03/2019
AUTHOR                              AARON FRYER

INTRODUCTION
This document will walk you through how to enable Active Directory Discovery for Snow Inventory.


 

REQUIREMENTS

- Snow Inventory 5.x / 6.x

- An Active Directory user account which is a member of the domain users group.


 

ENABLING ACTIVE DIRECTORY DISCOVERY

Log in to the Snow Inventory Server and open the SMACC console. Browse to Inventory Servers and click the show details option to be taken to the next page.




 


On this page, you’ll need to click edit





You’ll now be presented with this menu. Tick the box to enable Active Directory Discovery; once enabled and saved, this setting will start searching for machines in your domain.


 

Click the Add option and you’ll see the list of options below. Fill them out, click Add, and then Save.




 

To enable User Discovery options, complete the same process and hit save once done.