Category: ‘Service Manager’

Update Rollup 7 released – includes the write collision fix!

July 29, 2015 Posted by Anders Asp

Today is a big day! Not only was Windows 10 released today, but also Update Rollup 7 for SCSM. Why is UR7 such a big deal? Well, among other things, it contains a fix for the write collision issue that has been in the product since day 1! (See this post for more information about the error.)


After applying UR7, you shouldn’t be getting the error above if the WI you’re working with is updated by someone or something else (such as a workflow) while you still haven’t saved your changes. Instead your update will be merged with the other update.

Example: User A and User B work with the same Incident, let’s call it IR1234. User A updates the Title while User B still has the Incident form open. User A presses OK and moves along to the next case. User B now updates the same Incident, IR1234, with a new Description. Remember, the Title change that User A made is not yet reflected on the form, since User B had the form open when User A updated it! Previously, when User B pressed OK or Apply, the error above would be thrown, but with UR7 applied the update will be committed, and the form will be updated to also reflect the Title change committed by User A. To highlight that this merge of data has been done, the form will display a piece of information at the top, just like this.


The only exception to this would be if User A and User B actually update the exact same property. In that case, the old data collision error will still occur. For more information about the write collision fix, see this official blogpost:

To get UR 7 and to read about all the fixes, go here:


How to run Active Directory cmdlets in Orchestrator .Net Activity

March 13, 2015 Posted by Alexander Axberg

As you might have noticed, Orchestrator can be a bit grumpy when it comes to running PowerShell scripts. Some cmdlets will simply not load in a .Net Script Activity.
Running scripts in a .Net Script Activity is really nice, because we have the possibility to publish variables in the script directly to the data bus.
A way around this could be to run scripts remotely on another server, or to wrap your script in PowerShell { <your script> }, but in both cases we lose the possibility to publish all variables to the data bus.
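For reference, the wrapper trick looks roughly like this inside a .Net Script Activity (a sketch; the inner block runs in a separate, spawned PowerShell process, and the AD query is just an example):

```powershell
# Everything inside the PowerShell { } block runs in a spawned
# full PowerShell process instead of Orchestrator's v2 x86 host.
$result = PowerShell {
    Import-Module ActiveDirectory
    (Get-ADUser -Filter 'Name -like "A*"').Count
}
# Only what the inner block returns ($result) is available out here,
# which is why publishing arbitrary script variables to the data bus
# no longer works with this approach.
```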


The Problem

Orchestrator runs all powershell scripts in powershell v2 and in 32-bit mode. We can simulate this by:

  1. Starting a Powershell (x86) console
  2. Loading version 2 by typing: powershell -version 2
  3. If you now try to load the cmdlets with: import-module activedirectory, you will see that it fails with some strange errors


The Cause

Active Directory cmdlets are compiled against the .Net 4 assemblies, and PowerShell v2 will only load .Net 2 by default.
You can verify this by typing [Environment]::Version in your PowerShell console. “Major” is the version of the currently loaded .Net assembly.
If you compare a regular x64 PowerShell console with the x86 v2 console we started in the section above, you will see the difference.


The Solution

Add the following REG_DWORD registry value on the Orchestrator server and set it to 1, to make PowerShell (x86) always load the latest .Net assemblies:
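The value commonly documented for this scenario is OnlyUseLatestCLR. As a sketch (verify against an official Microsoft source before rolling it out in production):

```reg
Windows Registry Editor Version 5.00

; Forces 32-bit PowerShell to load the latest installed CLR/.NET assemblies.
; REG_DWORD set to 1; the Wow6432Node path is the one that affects
; Powershell (x86) on a 64-bit server.
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework]
"OnlyUseLatestCLR"=dword:00000001
```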


New TechNet Gallery contribution – Affected Items Custom Control

March 8, 2015 Posted by Anders Asp

I just posted a new contribution to the TechNet Gallery. This time it’s a custom control called Affected Items which you can add to any of your Work Item forms to get a consistent way of adding Affected Items. The control also makes the relationship Required, which means that the Analyst will have to add at least one CI to the control in order to save the WI. Here are some screenshots of how it can look when added to the form.



Service Request




For more information and to download the custom control, please head over to the TechNet Gallery at


Exchange Connector 3.1 released

December 12, 2014 Posted by Anders Asp

A new version of the Exchange Connector for Service Manager has been released. This updated version fixes these bugs:

  • Exchange Connector sometimes creates multiple work items for a single email request.
  • Exchange Connector gets stuck processing e-mails when large e-mails containing complex HTML formatting are processed.
  • Exchange Connector fails to process an email if it contains a ‘.msg’ file as an attachment.
  • Exchange Connector sometimes consumes very high memory (several GBs) when updating Work Items.
  • A new User CI is created in the CMDB if an external user (who is not in the CMDB) is a To/CC recipient of an email which the Exchange Connector processes. This means senders can accidentally add new users to the CMDB by adding external users to the To/CC of an email sent to the mailbox the Exchange Connector monitors. This behavior is now configurable.
  • Voting for a Review Activity using the Exchange Connector does not update the “VotedBy” field in the Review Activity.
  • Exchange Connector adds a new reviewer to the Review Activity when a user who is not in the reviewer’s list sends an email to approve/decline that particular Activity.

See this link for more information:

I’ve not tried it myself yet, so if you try it out and discover anything unusual, please drop a comment below!


Useful PowerShell snippets – Get-UserByEmail

November 10, 2014 Posted by Anders Asp

Here’s another useful PowerShell snippet which I created when I built my PowerShell Exchange Connector for SCSM. The function returns the user object for a given e-mail address. If no user is found, you also have the option to create the user object using the -CreateUser switch.

The function has the following parameters:

-EmailAddress – The e-mail address of the user you want to retrieve

-CreateUser – If this switch is entered, the function will create a user object if no match is found in the CMDB

-Name – If you know the name of the user that might be created, you can specify the name of the object in this parameter

The script itself:

# This function will return the user object of the specified Email Address. If a matching user isn't found, the function can create an Internal user within the SCSM CMDB
# NOTE: SMlets must be loaded in order for the function to work

Function Get-UserByEmail {
    param (
        [parameter(Mandatory=$true)][string]$EmailAddress,
        [switch]$CreateUser,
        [string]$Name
    )

    # Get all the classes and relationships
    $UserPreferenceClass = Get-SCSMClass System.UserPreference$
    $UserPrefRel = Get-SCSMRelationshipClass System.UserHasPreference$
    $ADUser = Get-SCSMClass Microsoft.AD.User$

    # Check if the user exists
    $SMTPObj = Get-SCSMObject -Class $UserPreferenceClass -Filter "DisplayName -like '*SMTP'" | ?{$_.TargetAddress -eq $EmailAddress}

    If ($SMTPObj) {
        # A matching user exists, return the object

        # If, for some reason, several users are found, return the first one
        If ($SMTPObj.Count -gt 1) {$SMTPObj = $SMTPObj[0]}

        $RelObj = Get-SCSMRelationshipObject -TargetRelationship $UserPrefRel -TargetObject $SMTPObj
        Return (Get-SCSMObject -Id ($RelObj.SourceObject).Id)

    } elseif ($CreateUser -and !$SMTPObj) {
        # A matching user does NOT exist. Do some processing to get the needed properties for creating the user object
        If (!$Name -or $Name -eq '') {
            $Name = $EmailAddress.Substring(0,$EmailAddress.IndexOf("@"))
        }
        $UserName = $Name.Replace(",","").Replace(" ","")

        # Test the username to make sure we have a unique username
        $Loop = $TRUE
        $i = 1

        While ($Loop -eq $TRUE) {
            $tempUser = Get-SCSMObject -Class (Get-SCSMClass System.Domain.User$) -Filter "UserName -eq $UserName"

            If ($tempUser -and $i -le 15) {
                $UserName = $UserName + $i
                $i = $i + 1
            } elseif ($tempUser) {
                Throw "Unable to find a unique username for the new user"
            } else {
                $Loop = $FALSE
            }
        }

        # Create the Property Hash for the new user object
        $PropertyHash = @{"DisplayName" = $Name;
                          "Domain" = "SMINTERNAL";
                          "UserName" = $UserName}

        # Create the actual user object
        $AffectedUser = New-SCSMObject -Class (Get-SCSMClass System.Domain.User$) -PropertyHashtable $PropertyHash -PassThru

        # Add the SMTP notification address to the created user object
        If ($AffectedUser) {
            $NewGUID = ([guid]::NewGuid()).ToString()
            $DisplayName = $EmailAddress + "_SMTP"

            $Projection = @{__CLASS = "System.Domain.User";
                            __SEED = $AffectedUser;
                            Notification = @{__CLASS = "System.Notification.Endpoint";
                                             __OBJECT = @{Id = $NewGUID;
                                                          DisplayName = $DisplayName;
                                                          ChannelName = "SMTP";
                                                          TargetAddress = $EmailAddress;
                                                          Description = $EmailAddress}}}

            New-SCSMObjectProjection -Type "System.User.Preferences.Projection" -Projection $Projection | Out-Null
        }

        # Return the created user object
        Return $AffectedUser
    }
}


Using the function to retrieve a user:
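A sketch of what that call can look like (assumes SMlets is loaded and a connection to the Service Manager management server; the address is just an example):

```powershell
# Look up the user object that has this SMTP notification address
$user = Get-UserByEmail -EmailAddress "anders@contoso.com"
$user.DisplayName
```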



Using the -CreateUser switch:
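Again a sketch with example values, under the same assumptions:

```powershell
# Create an internal CMDB user if no match is found,
# giving the new object an explicit display name
$user = Get-UserByEmail -EmailAddress "external@contoso.com" -CreateUser -Name "External User"
```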




You can also download the whole script as a file here: Get-UserByEmail

Useful PowerShell snippets – Add-SRComment

November 1, 2014 Posted by Anders Asp

I will be posting a series of PowerShell snippets that you can use together with Service Manager. This first one is a function called “Add-SRComment”, which you simply use whenever you would like to add an End User or Analyst comment to one of your Service Requests. I’ve been using this a lot when doing automated stuff in SCO and SMA lately.

The function has the following parameters:

-SRObject – Requires the actual SR object to which you would like to add the comment

-Comment – Requires the comment to add to the action log

-EnteredBy – Requires the name of the person who’s writing the comment

-AnalystComment – Switch to make it an Analyst comment instead of an End User comment

-IsPrivate – Switch to make it a private comment instead of a public comment

The script itself:

# This function adds a comment to the SR Action Log
# NOTE: SMlets must be loaded in order for the function to work

Function Add-SRComment {
    param (
        [parameter(Mandatory=$true)]$SRObject,
        [parameter(Mandatory=$true)][string]$Comment,
        [parameter(Mandatory=$true)][string]$EnteredBy,
        [switch]$AnalystComment,
        [switch]$IsPrivate
    )

    # Make sure that the SR Object is passed to the function
    If ($SRObject.Id -ne $NULL) {

        If ($AnalystComment) {
            $CommentClass = "System.WorkItem.TroubleTicket.AnalystCommentLog"
            $CommentClassName = "AnalystCommentLog"
        } else {
            $CommentClass = "System.WorkItem.TroubleTicket.UserCommentLog"
            $CommentClassName = "EndUserCommentLog"
        }

        # Generate a new GUID for the comment
        $NewGUID = ([guid]::NewGuid()).ToString()

        # Create the object projection with properties
        $Projection = @{__CLASS = "System.WorkItem.ServiceRequest";
                        __SEED = $SRObject;
                        $CommentClassName = @{__CLASS = $CommentClass;
                                              __OBJECT = @{Id = $NewGUID;
                                                           DisplayName = $NewGUID;
                                                           Comment = $Comment;
                                                           EnteredBy = $EnteredBy;
                                                           EnteredDate = (Get-Date).ToUniversalTime();
                                                           IsPrivate = $IsPrivate.ToBool()}}}

        # Create the actual comment
        New-SCSMObjectProjection -Type "System.WorkItem.ServiceRequestProjection" -Projection $Projection
    } else {
        Throw "Invalid Service Request Object!"
    }
}
Script syntax
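A sketch of what the calls can look like (assumes SMlets is loaded and that the ID and names are just example values):

```powershell
# Get the Service Request to comment on
$SRObject = Get-SCSMObject -Class (Get-SCSMClass System.WorkItem.ServiceRequest$) |
    Where-Object { $_.Id -eq "SR1234" }

# Add a public End User comment
Add-SRComment -SRObject $SRObject -Comment "Your request is being processed." -EnteredBy "SMA Runbook"

# Add a private Analyst comment
Add-SRComment -SRObject $SRObject -Comment "Waiting for approval." -EnteredBy "SMA Runbook" -AnalystComment -IsPrivate
```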

You can also download the whole script as a file here: Add-SRComment

Creating stack panels from the Authoring Tool (and some XML editing)

September 26, 2014 Posted by Anders Asp

Anyone who’s done some kind of form editing with the Authoring Tool knows that we only have a small number of controls to use when creating our customizations. One particular control that I’ve been missing is the Stack Panel. You know, the container object in which you place other objects, and which handles the placement of all its sub-controls for you?

As it turns out, you can “create” the Stack Panel without using Visual Studio with a small XML modification.

This is how you would do it:

  1. Open the Authoring Tool and open the form you would like to add the Stack Panel to
  2. Add the control named Panel to the place where you would like to have your Stack Panel. Do not do any other modifications to this control at this time!

  3. Save the Management Pack and open it in an XML editor (I use Notepad++)
  4. Locate the Panel control (which actually is a Grid) that we added. This should be at the bottom of the <Customization> tag if you didn’t do any other form customizations after you added the control, and should look similar to this:

    <AddControl Parent="StackPanel205" Assembly="PresentationFramework, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Type="System.Windows.Controls.Grid" Left="0" Top="0" Right="0" Bottom="0" Row="0" Column="0" />
  5. To convert the Grid to a Stack Panel, simply change the word Grid in type, to StackPanel. In the example above the code would look like this after the change:

    <AddControl Parent="StackPanel205" Assembly="PresentationFramework, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Type="System.Windows.Controls.StackPanel" Left="0" Top="0" Right="0" Bottom="0" Row="0" Column="0" />
  6. Save the file and reload (close and open) the MP in your Authoring Tool. The Panel should now be a Stack Panel and you can go ahead and do the rest of your customizations!

Use the Exchange connector for updates only

September 11, 2014 Posted by Anders Asp

The Exchange Connector is an essential part of almost every Service Manager installation. Some customers, however, do not want to create new Incidents/Service Requests upon receiving new e-mails, but would rather have the connector handle updates only. This is not possible to configure in the connector itself, which causes some people to think that it isn’t possible at all. But what can we do about this on the Exchange side?

Well, we know that the connector itself will create new work items if incoming e-mails are missing the work item id tag in the subject, such as [IR412] or [SR9122]. If the tag is present in the subject, the connector will update the matching Work Item with the information within the e-mail. So if we can block or reject any e-mails missing the ID tag, the connector would only receive updates, right?

To do this, we would have to create a new rule from the Exchange console. The example below is from my Exchange 2013 lab environment but the same rule is applicable to Office 365 as well.

  1. Open the Exchange Admin Center by using your browser to access https://<servername>/ecp
  2. Log in with an account that has Exchange rights and go to the mail flow tab
  3. Under rules, click the + sign to add a new rule
  4. Select Create a new rule…
  5. Give the rule a name, such as Exchange Connector – accept updates only
  6. Under Apply the rule if… select The Recipient… > address matches any of these text patterns and specify the e-mail address of your Exchange Connector
  7. Under Do the following… select Block the message… > reject the message and include an explanation and enter a message of your choice
  8. Under Except if… select The subject or body… > subject matches these text patterns and specify the text pattern exactly like this: \[\D\D\d*\]
    The text pattern will match any email subject containing the ID of a Work Item enclosed in square brackets, just like this: [IR123]. (The \D matches one non-digit character, such as a letter, and the \d matches one numeric character.) This is what you should end up with:
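Since Exchange transport rules use .NET-style regular expressions, the pattern can be sanity-checked in any compatible regex engine before you touch the rule. A quick sketch in Python (the example subjects are made up):

```python
import re

# Same pattern as in the Exchange rule: a literal "[", two non-digit
# characters (the work item prefix, e.g. IR or SR), any number of
# digits, and a literal "]".
pattern = re.compile(r"\[\D\D\d*\]")

subjects = [
    "RE: [IR412] Printer is broken",   # tagged   -> update, let through
    "SV: [SR9122] New laptop",         # tagged   -> update, let through
    "Help, my computer is on fire",    # no tag   -> rejected by the rule
]

for subject in subjects:
    tagged = bool(pattern.search(subject))
    print(f"{subject!r}: {'update' if tagged else 'reject'}")
```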

When you’re done in Exchange, try to send some e-mails to your Exchange Connector e-mail address to verify function. Any e-mails missing the id tag should be rejected with a message, and updates should get through and picked up by the connector. When working as it should, implement this into your production environment – because you’re not testing new stuff directly into production, are you? 🙂


Had a couple of questions on how this could be done in Exchange 2003, see the picture below. Please note that the actual text pattern is a bit different!


How to add mail addresses to Data Warehouse

June 25, 2014 Posted by Alexander Axberg

In this post I will go through how to add your users’ mail addresses to the Data Warehouse, to be able to display them in reports.
Since they are not transferred to the Data Warehouse by default, we have to build a new Data Warehouse Management Pack to be able to sync this information.

But first, a quick look at how the mail addresses are stored.
They are not simply stored in a text string directly on the user object, as you might think. They are actually stored as a separate object of the class “System.Notification.Endpoint”. This makes it possible to create several addresses on the same user (SIP and SMTP).
A relation between this object and the user object is then created. The relation is called System.UserHasPreference.

So what we need to do is to define a dimension for the System.Notification.Endpoint, and include the attributes that store the actual mailaddress.
Then we also need a Relationship Fact between the Notification Endpoint dimension and the User dimension.

The code to create that looks like this:

      <Dimension ID="SubscriberAddressDim" Accessibility="Public" InferredDimension="true" Target="Notifications!System.Notification.Endpoint" HierarchySupport="Exact" Reconcile="true">
        <InclusionAttribute ID="TargetAddress" PropertyPath="$Context/Property[Type='Notifications!System.Notification.Endpoint']/TargetAddress$" SlowlyChangingAttribute="false" />
        <InclusionAttribute ID="ChannelName" PropertyPath="$Context/Property[Type='Notifications!System.Notification.Endpoint']/ChannelName$" SlowlyChangingAttribute="false" />
        <InclusionAttribute ID="Id" PropertyPath="$Context/Property[Type='Notifications!System.Notification.Endpoint']/Id$" SlowlyChangingAttribute="false" />
      </Dimension>
      <RelationshipFact ID="HasPreferenceFact" Accessibility="Public" Domain="DWBase!Domain.ConfigurationManagement" TimeGrain="Daily" SourceType="System!System.Domain.User" SourceDimension="DWBase!UserDim">
        <Relationships RelationshipType="SupportingItem!System.UserHasPreference" TargetDimension="SubscriberAddressDim" />
      </RelationshipFact>

So the complete steps to create our new Data Warehouse MP looks like this:

  • Create a new MP with the code above, or download the complete one below
  • Seal it, and import it into Service Manager as usual
  • Wait for the MPSyncjob in Data Warehouse to kick in (every hour) or start it manually. The MP will then be synced into DW.
  • Grab a beer while you wait for the deployment in the Data Warehouse.
  • When deployment is completed, log into the DWDataMart database in SQL and look under Views; you should have 2 new views there: SubscriberAddressDimvw and HasPreferenceFactvw
  • Now you are all set to start querying the database in reports to display the mail addresses. You can use the following SQL query to list all your user objects in the DW with the columns Username, Domain and Mailaddress:

    Keep in mind that after the Management Pack deployment is completed, it could take a while to populate the tables with the mail addresses.

    SELECT
        u.UserName,
        u.Domain,
        smtp.TargetAddress AS 'E-Mail'
    FROM
        UserDimvw AS u
        INNER JOIN HasPreferenceFactvw AS hp
            ON u.UserDimKey = hp.UserDimKey
        INNER JOIN SubscriberAddressDimvw AS smtp
            ON hp.UserHasPreference_SubscriberAddressDimKey = smtp.SubscriberAddressDimKey
    WHERE
        smtp.ChannelName = 'SMTP'
        AND hp.DeletedDate IS NULL


Moving SLOs from one environment to another? (Part 1)

June 18, 2014 Posted by Anders Asp

Service Manager is built around storing your configuration in Management Packs. This is a great solution when you’re working with several different environments and would like to move your configuration between these, such as a test and a production environment. Most of the configuration you do is stored in different Management Packs, while data is stored in the database.

With this in mind, let’s take a closer look at how Service Level Objectives, SLOs, are constructed.


As the picture above displays, the SLO itself is constructed of a Calendar, a Metric, a Queue and a specified Target Time. When creating a new Calendar or Metric, these will not be stored in an MP; instead they will only be created in the database. However, when you are creating the SLO itself, you are able to specify an MP to store it in, so the SLO itself should be stored in an MP, shouldn’t it? Unfortunately not!

New SLO

So what is really stored in the Management Pack specified, if not the SLO itself?

– SLO workflow group
– SLO workflow target
– SLO workflow: AddRelationship
– SLO workflow: DeleteRelationship
– SLO workflow: EndEvent
– SLO workflow: StartEvent
– SLO workflow: PauseEvent (disabled by default)
– SLO workflow: ResumeEvent (disabled by default)

In other words, parts of the SLO configuration and how it is calculated are stored in the MP (yes, the SLO is based upon a set of workflows), but not the actual SLA Configuration object.

As a result of this, we are not able to copy SLOs from one environment to another by export/import of an MP, since most of the configuration regarding your SLOs is stored in the database itself. If you try to do this, you will end up with a number of “ghost workflows” – not visible in the console and related to an SLO that doesn’t exist.

Here’s an example of that – in this first picture, you can see my existing SLOs in the system and all the workflows following a certain pattern (that applies to SLO workflows). Note how my SLOs match these workflows.

Before MP import

Below is a picture displaying the exact same thing after an import of an MP containing two other SLOs. Note that these SLOs are not visible in the console and are not functional at all, yet a number of workflows have been created within Service Manager (marked with red). These are the so-called “ghost workflows”.

After MP import

These “ghost workflows” will not function and will throw errors in the Operations Manager event log on your Management Server, just like this:


So to summarize: Do not try to export/import the MP containing SLOs to copy SLOs from one environment to another. Doing so will only result in a number of erroneous “ghost workflows” that might affect performance and stability, and clog up your event logs with events.

In the second part of this blogpost I will try to create a script or runbook that you can use to copy SLOs from one environment to another instead – stay tuned!