MonoPrice 3D Printer: My Review

This purchase was very painful.

Prior reviewers on the Monoprice site said this is a FlashForge Creator X model, which on the surface appears to be the case. However, when I examined the motherboard, mine turned out to be a FlashForge CreatorBoard Rev ‘D’. This is important because it means you cannot upgrade the firmware to the very powerful and feature-rich Sailfish firmware. The Sailfish documentation indicates it is only compatible with revisions E, F, and G, not ‘D’. I also learned from the Google group members that revisions E, F, and G are open source but ‘D’ is not.

Monoprice says to use ReplicatorG, but that is no longer an active open source project; no one seems to be developing it. So you are stuck with it! I wanted to use Sailfish and OctoPrint, but the machine is too limited for that.

All that aside, my first print is posted on the following Google+ link. I was fairly pleased with it, considering all the flaws and it being my very first attempt.

After soliciting feedback on Google+, I attempted to make modifications to the file but soon found that the nozzle was jammed and needed to be cleared. The user manual only mentions that a jam must be cleared, with no further information about how to do so. So I was on my own.

Monoprice support never really revealed what type of printer this is. They did tell me it was a clone of a MakerBot but with different firmware, and that no additional documentation was available.

After taking apart the entire extrusion assembly, clearing the nozzle, and reassembling it, I wanted to make more adjustments to the file. But since I was stuck on ye-old ReplicatorG with no ability to upgrade to Sailfish, I could not.

I then decided to just keep printing 20mm cubes until one came out correctly, which none did. Furthermore, when you add the other nozzle, everything gets stringy and turns into a mess.

So, being completely new to this 3D printing thing, I figured I would return it and get something better supported and open source. I want to have success with my prints so I can learn, without so many limitations from the software and partially supported hardware. I just want to print, not mess too much with the machine.

Below are the pictures of the tests I conducted before I quit and returned it.

I am now waiting for the completely open-source Prusa RepRap.









My New Monoprice 3D Printer

In mid-May 2015, my Gmail inbox contained what seemed like a very good email coupon from Monoprice for their Dual Extrusion ABS/PLA/PVA 3D Printer.

I was also considering the Printrbot Simple Metal 3D Printer, which I learned about from Scott Hanselman’s blog and podcast, and in all likelihood I would have gone with it because of the $599 price point and Hanselman’s good reviews.

However, after receiving that Monoprice coupon and comparing the two, it certainly appeared that the Monoprice printer had everything the Printrbot did and more.

For instance, features like:

  1. Heated build plate
  2. Dual extruders
  3. 3 types of filament options
  4. Enclosed housing

seemed like great features, especially since the Printrbot offered a heated build plate only as a $100 add-on.

So I took that bait and ordered the following:

  • Printer
  • PLA 1.75 White Filament
  • PLA 1.75mm Black Filament
  • 3D Printer Kapton Tape 150mm x 30mm
  • 33 ft USB Cable

Out-the-door final price:  $842.82!  Can’t beat that folks!

It is worth noting that the machine comes with a black and a white spool of filament, but I bought a couple more. It also comes with a piece of Kapton tape on the build plate, but I bought some extra of that as well.

After opening the box, I pulled out the SD card that it came with and printed one of the test models. You can see there are a few stray plastic hairs that marred the print, and some of the layers are not squeezed quite tightly enough in certain spots, but all in all, not bad.





My online research and questions to those more experienced led me to lower the temperature to 200°C and try again. I’ll post those results soon.


Understanding Document Sharing in SharePoint Online

SharePoint Online offers users the ability to share documents with people outside your immediate organization. It seems simple enough in theory but gets confusing in practice. Here are the key elements to understand if you find yourself as confused as I was when testing.

For this to work, you must first configure the SharePoint Online tenant and site collection to allow sharing. I am not covering how to set that up in this article, but you can learn about it here, here and here.
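As a rough sketch only (these cmdlets come from the SharePoint Online Management Shell; the URLs and site names below are placeholders, not from this article), a tenant administrator enables external sharing along these lines:

```powershell
# Connect to the tenant admin site (placeholder URL)
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Allow sharing with authenticated external users at the tenant level
Set-SPOTenant -SharingCapability ExternalUserSharingOnly

# Then allow it on the specific site collection (placeholder URL)
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/projects `
    -SharingCapability ExternalUserSharingOnly
```

Use `ExternalUserAndGuestSharing` instead if you also want anonymous “Get a Link” sharing; a site collection’s setting cannot be more permissive than the tenant’s.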

Steps to share a document

The first order of business is, obviously, to select the document you wish to share by clicking its ellipsis (…) menu to expose the “Share” option.

Once the share dialog is opened enter the following information:

  1. The email of the person you are sharing with.
  2. Permissions to either edit or only view the document.
    (In my opinion, this incorrectly defaults to “Can Edit” instead of “Can View” so if you don’t want the person you are sharing with to edit you must remember to change this setting each time.)
  3. Enter a friendly message to go along with the email invitation.

Note that the “Get a Link” option on the left side of the dialog and the “Require sign-in” option are available only if both your SharePoint Online tenant AND site collection are configured to allow sharing via anonymous links. In most corporate environments you probably always want users to log in so you have control over who has access to your documents, and this article assumes so; otherwise, any user who has the link can access your documents. The site collection and tenant administrators have access to those settings, so see them for further information.

After completing the dialog and pressing the “Share” button the user will receive an invitation email similar to the one below.

How External Users Log In

Assuming you require users to sign-in to view shared documents, the user can click the document link to be directed to a login page prior to access.

How users sign-in and what account they use was a point of confusion for me. Notice below the three login options.

  1. Microsoft Account: Use this option to sign in with ANY Microsoft account the user may have: Skype, Xbox, Hotmail, or anything Microsoft. This is the option to use if the “Organizational Account” option is not applicable and you have such an account.
  2. Organizational Account: Use this option if you have an account specific to the SharePoint Online tenant in which the shared document resides. The term “Organizational Account” was and is confusing to me, so I just think of it as the tenant-specific account. It is possible that you belong to other SharePoint Online tenants, so log in using the appropriate account.
  3. Create a Microsoft Account: This is the third option, which I overlooked initially. If the user has no existing Microsoft account and no tenant-specific account, they can just create an account on the fly using this option. Once created, it can be used over and over again.


Choosing either option 1 “Microsoft Account” or option 2 “Organizational Account” will direct the user to a login page like this.

Choosing option 3 to create a Microsoft account directs the user to an account creation page shown below. This form is straightforward; just provide an email address and some other information for verification.

When the form is submitted, a verification email is sent to the user; after verification is complete, the user’s account is created and they can proceed to access the document.


Managing Your Shares

As a site content owner, you will want to manage what is being shared and with whom. There are two tools with which you can do this.

Shared With Dialog

The first method is to use the ellipsis (…) menu associated with the document to expose the “Shared With” option.

Note that in this dialog there is no option to remove users, only to view them. However, clicking the “Advanced” link takes you to the “Permissions” administration page for that document. The “Permissions” admin page is where you remove users to prevent further access.

Also note that external users you have shared with will not be listed until they actually access the document. Until then, the user is on a “Pending” list, as demonstrated below.

Viewing all Shares

The method mentioned previously is fine for understanding shares of a specific document, but what if you want to view all the shares for all documents? For that, go to “Site Settings” and select “Access requests and invitations” in the “Users and Permissions” section. Don’t worry if you can’t find it immediately; it is only visible once the first invite has been initiated and sharing is turned on for the site collection.


The Access Request page has three sections:

Pending Requests

The first section shows any user who has requested access to something. When a user is selected in the list, the top Ribbon menu allows you to accept or reject the request.

External User Invitations

The second section shows any external user who has been invited to view or contribute to a document but has not yet accepted. Selecting a user in this section enables the Ribbon options to resend or withdraw an invitation. Withdrawing an invitation moves the account to the History section mentioned below.


History

The third section is for historical reference. When an invitation is sent it is in a “Pending” status, and it will eventually expire. Once the user accesses the document or the expiration date is reached, you will find the account moved into “History,” where it resides for auditing.

It is interesting to note that the lists in each section provide the ability to create views (note the “Modify this View” and “Create View” links), allowing better organization of the data when many records are present.

This article covers the core components and issues necessary for understanding how documents are shared with external users. Many of these concepts apply to sharing sites and folders as well.

Lastly, as with many testing exercises in SharePoint Online, you are at the mercy of timer job frequencies over which you have no control. As such, at times it can be frustrating waiting for emails to be sent and for dialog boxes to actually show shared user accounts.

Thanks for reading and I look forward to your comments and questions.







Office 365 Premium == Happy Home Computing

For a measly $20 per user per year you can now power up your family of 5 with:

  • Word
  • Excel
  • PowerPoint
  • OneNote
  • Outlook
  • Access
  • Publisher


Yes, that’s right: 5 family licenses for $100 per year! I can suit up my kids and wife with all the tools for school, work, and social life, and never worry about upgrades again! Me so happy!


If you are not familiar with Office 365, it is the same Microsoft Office you are familiar with, only your license allows you to access it not only from your laptop or desktop but also from your browser, mobile phone, and public PCs or Macs. Upgrades just happen, so you always have the latest and greatest features and are unburdened from the hassle of buying, downloading, and installing upgrades yourself (big time saver). Documents can be stored on OneDrive, so you are free from losing data when your hard drive fails. OneDrive also makes it very easy to share a document with others: just select it and click “Share.” You can then share it with anyone you want and set permissions for them to either view only or edit.


Each license comes with 20 GB of OneDrive storage, making it extremely easy to share documents with each other and with outsiders like teachers, friends, coworkers, and extended family. Also included are 60 minutes of Skype calling per month, so if you make that many calls it almost offsets the price of a single user license.


I am having great success with this product and would like to hear how or if it works for you!


Link to the Amazon download here.

Developing PowerShell Scripts with PowerShell Tools for Visual Studio

I think I have finally hit the sweet spot in developing PowerShell scripts. For the past few years PowerGUI has been my tool of choice, but as the libraries for SharePoint and Azure have become very much a part of my every day, I have found that tool lacking the PowerPacks and other conveniences needed to get set up quickly. I have been frustrated trying to set up the correct references so that IntelliSense works with Azure and SharePoint; I just have not gotten it right. If you have pointers for me, I would love to hear them.

So here is how I am set up now, and I can tell you I like it a lot because it gives me the opportunity to:

  • Use Visual Studio
  • Use IntelliSense
  • Easily integrate with TFS
  • Use standard F5 debugging tools
  • Examine variables in debug mode
  • Execute cmdlets directly

The “Immediate Window” is useless in this scenario, so you have to rely on the Locals and Watch windows to examine variables. To execute cmdlets directly, just open the NuGet Package Manager Console, which is itself a PowerShell console containing all the references your project needs; no additional setup required.

To get things going do the following:

Building SharePoint Online Business Solutions using Document Sets – Part 1

In this series of posts I will discuss how SharePoint 2013 Document Sets and other SharePoint features can be combined to provide a robust solution for managing numerous related documents, such as those produced by project management, legal cases, contract negotiations, or business proposals.

In this set of articles we will create a typical corporate Project Management Office (PMO) solution that structures and groups together all document artifacts related to a given project, with as little administrative overhead as possible.

The sub-goals of satisfying this requirement are also:

  1. Use out-of-the-box SharePoint (no-code) capabilities as much as possible
  2. Centrally initiate new projects and ensure all have common metadata, document structures, and templates
  3. Populate the project with standard PMO document templates such as Project Charter, Status Reports etc.
  4. Automatically associate all project documents together
  5. Search, aggregate, filter Projects within and across Site Collections
  6. Leverage retention policies so projects can be archived for safe keeping

Create Content Types

Document Set Content Type

First, we create a “Document Set” Content Type, appropriately called “Project Document Set.” By leveraging the Document Set’s synchronization feature, we can achieve quite a bit of automation without code.

The “Document Set” Content Type is an out-of-the-box SharePoint Content Type with special powers for coordinating the properties among the Content Types it manages. This coordination allows you to treat a group of documents as though they were one, so you can manage an entire grouping of documents as a single unit; for example, moving all related documents to a Records or Document Center for archiving.

For detailed information about the Document Set Content Type please refer to this article from Microsoft.

The image below illustrates the concept more clearly. The green oval represents the Document Set Content Type with the Content Types it manages inside. Notice the common fields marked with a red star. The Project Document Set is configured to synchronize the common fields, alleviating the burden of requiring users to complete those fields over and over again with each new document added to the set.

There is a design decision here that you will need to make as far as where to define your Content Types:

You can define Content Types and Site Columns either in a single site collection or in the central Content Type Hub. The difference is that when they are defined in the Content Type Hub, they are available throughout all the site collections in your SharePoint farm via the Content Type Publishing feature; otherwise, the Content Type definitions are only available in the site collection in which they were defined. For more information about Content Type Publishing and the Content Type Hub, refer to this article.

I’m not going to cover what a Content Type is or how to define one in this article. If you want to learn and understand them, start here.

Shared Site Columns

The “Project Document Set” Content Type will have the following fields, so these need to be defined as Site Columns:

  1. Project Name – Single Line Text, Required
  2. Project Type – Managed Metadata, Required
  3. Project ID – Single Line Text, Required
  4. Project Status – Choice

You will want to mark some of the fields as required. By doing so, you can later use those fields in a Document Center or Records Center to file your documents using automatic rules. In subsequent articles I’ll demonstrate how to archive these Project Document Sets using automatic filing rules, so I am marking Project ID, Project Type, and Project Name as required.

Project Management Document Types

Next, we create Content Types for the various document types that the Document Set will manage. For this I am using a few standard PMP document templates available at http://www.projectmanagementdocs.com.

I will create a Content Type for each document template I downloaded. The Parent Content Type for each of them will simply be SharePoint’s “Document.”

  1. Charter Template
  2. Communication Plan
  3. Status Report
  4. Work Breakdown Structure


Although I did not do it here, I find it helpful to first create a custom base Content Type derived from “Document” (call it “Project Document” or something) and then derive all the remaining Content Types from that. This provides the flexibility of easily adding fields to all child Content Types by simply adding columns to the base.


Each of the four Content Types will use the same site columns we defined before creating the “Project Document Set”:

  1. Project Name – Single Line Text
  2. Project Type – Managed Metadata
  3. Project ID – Single Line Text

Reusing these same fields is important because, as mentioned earlier, SharePoint will automatically synchronize these fields across all the Content Types it manages.

Here is the completed Content Type definition for “Project Charter.” The other three Content Types will be configured the same.

Associate Individual Content Types to the Document Set Content Type

Now we have our “Project Document Set” Content Type defined along with the other Content Types it will manage.

Configuring the Document Set

To configure the “Project Document Set” click it from the list of Content Types and select Document Set Settings as shown.

Here is where all the magic happens.

Allowed Content Types

The first section, “Allowed Content Types,” is where you associate the Content Types that the Document Set will manage. Here I have selected a group that contains all the PMP project Content Types.

Default Content

The next section, “Default Content,” defines documents that will be pre-populated when a user creates a new “Project Document Set.” This is awesome because at any time an administrator can change or remove a document template, ensuring new projects use the latest templates.

To configure it, just select one of the Content Types in the dropdown, specify a folder name where the template will reside, and upload a document. The document file is mandatory, so you cannot create an empty folder this way. (In another post I’ll demonstrate a remote event receiver that deletes the file so the user sees only an empty folder.)

Below is a newly created Project Document Set; notice how SharePoint creates the folder(s) specified in the “Default Content” settings.

Drilling into that folder you will find a copy of each document that was also specified.

Shared Columns

The “Shared Columns” section lists the fields in the “Project Document Set” Content Type that will be synchronized with the Content Types it manages. Simply check the fields that need to be synchronized. Synchronization works not only when a new Document Set is created but also when you update one of the synchronized fields. For instance, if a project name changes after some time, it can be updated on the Document Set and all child items will be updated as well.

Continuing from the previous image, below are the properties of the files where the Project Name, Project Type, and Project ID fields were synchronized from the parent Document Set.

Note that the word “Frozen” in the Project Name is misspelled. To clean that up, all that is necessary is to update the Project Name field with the correct spelling in the Project Document Set, and SharePoint will synchronize that change to all the child Content Types it manages!


There are two primary issues with this out-of-the-box Document Set functionality that I don’t like. I found no way around them, so I eventually ended up creating a Remote Event Receiver to clean them up.

The first issue is the Content Type SharePoint assigns to the sub-folders. Remember that in the Document Set Settings’ “Default Content” section, folders and templates can be deployed when a new Document Set is created. Here I noticed that the assigned Content Type is the “Project Document Set” itself, which means the sub-folders are themselves “Document Sets” and not the typical “Folder” Content Type you would expect. Having the sub-folders assigned to my “Project Document Set” Content Type is problematic because we will get incorrect search results when querying the system for Projects.

Not only was the folder’s Content Type wrong, but I also noticed that the Project Name, Project ID, and Project Type fields were not synchronized. I experimented with allowing my “Project Document Set” to also manage the “Folder” Content Type, but it cannot be selected in the “Allowed Content Types” section of the Document Set Settings. As mentioned above, I eventually developed a Remote Event Receiver to update the sub-folders so that they are “Folder” Content Types, with their field values populated correctly.

The next big undesirable I found when working with Document Sets is the “Name” field of the Document Set Content Type. Apparently, the “Document Set” Content Type derives from the “Document Collection Folder” Content Type, which in turn derives from “Folder.” This means a Document Set is basically a “Folder,” and it has a required “Name” field. As such, the “Project Document Set” Content Type must also have a “Name” field, which is redundant with the “Project Name” field we defined. My solution was to have the user simply enter the project name in the “Name” field and then have the Remote Event Receiver copy the value from “Name” to the “Project Name” field.

Up Next

In subsequent posts I’ll demonstrate how to tune the search engine to display and filter these Projects, and how they can be moved in their entirety to a Document or Records Center using automatic filing rules.














Release of my latest Web Project: “Check In Now”

I don’t usually get to showcase my development work because the vast majority is for internal business and intranet usage. Today, however, is an exception, as we have gone live with a project I have worked on over the past 4 months. It is an ASP.NET MVC 4, HTML5 application which leverages Entity Framework 6, jQuery, Twitter Bootstrap, OAuth, Toastr, and other plugins.

The Covenant Medical Group “Check In Now” application allows patients to view current wait times and pre-register at four urgent care facilities located in Lubbock, Texas. The app will be expanded to cover many more facilities and regions of the St. Joseph Health System. Patients can complete paperwork at home, ahead of their arrival time; doing so fast-tracks their registration process and time to treatment once they arrive. Facility administrators can update wait times, view the incoming queue of patients, and send emails and SMS text messages directly to arriving patients.



Analyzing Your Site’s usage

Recently I was asked if there was a web part that could be added to a page to track site visitors.

The answer to that is no.

However, each SharePoint site collection has a more powerful tool for understanding site usage. SharePoint tracks usage for you, and that information is available under Site Settings > Site Collection Web Analytics Reports. Within this area is a plethora of data, including:

  • Top Pages – Pages most visited
  • Top Visitors – Users that visit your site most frequently
  • Top Queries – Search terms your users use to find information on your site. These can be used in conjunction with Search Scopes and Search Keywords to fine-tune search results.
  • Trends and other statistics


Seeking new Web/SharePoint Leadership Opportunities

My most recent position with St. Joseph Health, as Manager of Web & SharePoint Development, has been eliminated along with 30 or so other IT positions due to budget-driven workforce reductions. I really enjoyed working there and had been concerned about this possibility for many months, but I wanted to see it through rather than leave.

So the good news is I am ready for the next challenge and wanted to be sure my readers were aware of this situation. I am seeking opportunities in technical leadership with focus on web and SharePoint development and strategy.

My resume can be found here: http://tjo.me/13hH4lv. If you or others in your network need access to it, just let me know and I’ll share it out if it is not already.

I really appreciate your support, and if you hear of any openings that may be a fit, please let me know so I can evaluate them.

Thanks for reading!

Sniffing SharePoint for Enterprise License Feature Usage

SharePoint Enterprise features give users advanced capabilities not found in standard SharePoint licensing. Once you have licenses for the Enterprise features, all you do is enable them on the site collections and users are free to consume them.

Once enabled, however, there is no easy way to determine if they are actually being used. Enterprise licenses are pretty expensive, so you want to be sure you are getting your money’s worth.

One painful way to determine usage is to manually thumb through sites looking for things like Excel or Visio web parts, InfoPath forms, Content Organizer rules, Data Connection Libraries, or PowerPivot libraries. That is not practical, accurate, or fun.

In my case I have no fancy commercial tools to manage and administer my farm, so I often resort to rolling my own PowerShell scripts. The following are some functions I wrote to detect and document instances of Enterprise feature usage.

The first thing you need is a loop that iterates down through your farm’s site collections and sites. Something like this usually works fine.

$rootSite = New-Object Microsoft.SharePoint.SPSite("http://my.spdomain.org")
$spWebApp = $rootSite.WebApplication
$allSites = $spWebApp.Sites
foreach ($site in $allSites) {
    Write-Host "Site: " $site.Url
    $subWebs = $site | Get-SPWeb -Limit All
    foreach ($web in $subWebs) {
        Write-Host "Web: " $web.Url
        # Put detection code here
        $web.Dispose()
    }
    $subWebs = $null
}
$allSites = $null
$spWebApp = $null
$rootSite.Dispose()

Once this main loop is in place, we can insert detection functions that analyze the SPList objects of each site. With the help of some online research, below are the functions I came up with. Each function accepts a site URL and iterates over the site’s SPList objects to determine whether the condition I am looking for is present. If an SPList object contains the feature I seek, it is added to an ArrayList and returned to the calling loop. The main loop collects all the ArrayLists and then exports them to CSV, or whatever else you want to do with them. I export to CSV and pull those files into MS Access for creating nice reports.
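The collect-and-export step described above can be sketched roughly like this (the function invocation and output path are illustrative placeholders; adapt them to whichever detection functions you plug into the loop):

```powershell
# Accumulate one row per finding, across all detection functions
$results = New-Object System.Collections.ArrayList

# Inside the per-web loop: run a detection function (hypothetical call)
$found = Get-PowerPivotLibs -siteUrl $web.Url
foreach ($lib in $found) {
    $results.Add((New-Object PSObject -Property @{
        WebUrl  = $web.Url
        ListUrl = $lib.RootFolder.ServerRelativeUrl
        Feature = "PowerPivot"
    })) > $null
}

# After the loop: export everything to CSV for reporting
$results | Export-Csv -Path "C:\Reports\EnterpriseFeatureUsage.csv" -NoTypeInformation
```

Flattening each SPList into a small property bag before exporting keeps the CSV columns predictable, which makes the later import into MS Access straightforward.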

Detecting PowerPivot Libraries

The trick to this function is to compare a document library’s TemplateFeatureId property to the feature IDs for PowerPivot, which are specified in the code.

function Get-PowerPivotLibs {
    param([System.String]$siteUrl = "http://my.spdomain.org/team/Team_Site_Test")
    $powerPivotFeatures = @("1a33a234-b4a4-4fc6-96c2-8bdb56388bd5", "e9c4784b-d453-46f5-8559-3c891d7159dd", "f8c51e81-0b46-4535-a3d5-244f63e1cab9")
    # ArrayList to hold any found PowerPivot libraries
    [System.Collections.ArrayList]$PPLibs = New-Object System.Collections.ArrayList($null)
    $libs = Get-SPWeb $siteUrl |
        Select -ExpandProperty Lists |
        Where { $_.GetType().Name -eq "SPDocumentLibrary" -and -not $_.Hidden }
    foreach ($lib in $libs) {
        if ($powerPivotFeatures -contains $lib.TemplateFeatureId) {
            $PPLibs.Add($lib) > $null
            Write-Host -BackgroundColor Blue -ForegroundColor White "PowerPivot Lib Found"
        }
    }
    # Leading comma prevents PowerShell from unrolling the ArrayList
    return ,$PPLibs
}

Detecting Data Connection Libraries

This method is the same as the one used for detecting PowerPivot libraries, with the exception of a different feature ID.

function Get-DataConnectionLibs {
    param([System.String]$siteUrl = "http://my.spdomain.org/team/Team_Site_Test/InternalTeam/DocCtr/")
    Write-Host -ForegroundColor Gray "Searching for Data Connection Libraries: " $siteUrl
    $dataConnFeatures = @("00bfea71-dbd7-4f72-b8cb-da7ac0440130")
    # ArrayList to hold any found Data Connection libraries
    [System.Collections.ArrayList]$DCLibs = New-Object System.Collections.ArrayList($null)
    $libs = Get-SPWeb $siteUrl |
        Select -ExpandProperty Lists |
        Where { $_.GetType().Name -eq "SPDocumentLibrary" -and -not $_.Hidden }
    foreach ($lib in $libs) {
        if ($dataConnFeatures -contains $lib.TemplateFeatureId) {
            $DCLibs.Add($lib) > $null
            Write-Host -BackgroundColor Blue -ForegroundColor White "Data Connection Lib Found"
        }
    }
    return ,$DCLibs
}

Detecting Excel Web Parts

This one is a bit more complicated and requires first determining whether the function is dealing with a Publishing site. For Publishing sites, we have to get something called a “LimitedWebPartManager” from each SPListItem in the Pages library. The “LimitedWebPartManager” lets you iterate over each web part on the page to detect whether any web part is of type “ExcelWebRenderer.”

If we are not dealing with a Publishing site, the function looks through all SPList objects that have a “BaseTemplate” property of “WebPageLibrary.” Again, using the “LimitedWebPartManager,” we look for the same “ExcelWebRenderer” web parts.

When we find even one “ExcelWebRenderer” web part on a page, we add the SPList to the ArrayList, which is returned to the calling loop.

function Get-ExcelWebParts {
    param([System.String]$siteUrl = "http://my.spdomain.org/team/Team_Site_Test/")
    # Reference sites:
    # http://sharepointpromag.com/sharepoint/windows-powershell-scripts-sharepoint-info-files-pagesweb-parts
    # http://support.microsoft.com/kb/2261512
    Write-Host -ForegroundColor Gray "Searching for Excel Web Parts: " $siteUrl
    $ExcelWebRenderer = "ExcelWebRenderer"
    # ArrayList to hold any found Excel web part pages
    [System.Collections.ArrayList]$ExcelWps = New-Object System.Collections.ArrayList($null)
    $web = Get-SPWeb $siteUrl
    $lists = $web |
        Select -ExpandProperty Lists |
        Where { -not $_.Hidden }
    if ([Microsoft.SharePoint.Publishing.PublishingWeb]::IsPublishingWeb($web)) {
        $pWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
        $pages = $pWeb.PagesList
        foreach ($item in $pages.Items) {
            try {
                $manager = $item.File.GetLimitedWebPartManager([System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
                $wps = $manager.WebParts
                $webPartCount = 0
                foreach ($wp in $wps) {
                    if ($wp.GetType().Name -eq $ExcelWebRenderer) {
                        Write-Host -BackgroundColor Blue -ForegroundColor White "Excel Web Part Found"
                        $webPartCount++
                    }
                }
                if ($webPartCount -gt 0) { $ExcelWps.Add($pages) > $null }
            }
            catch [System.Exception] {
                #Write-Host -ForegroundColor Red -BackgroundColor Black "Error"
            }
        }
    } else {
        foreach ($list in $lists) {
            if ($list.BaseTemplate -eq "WebPageLibrary") {
                foreach ($item in $list.Items) {
                    try {
                        $manager = $item.File.GetLimitedWebPartManager([System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
                        $wps = $manager.WebParts
                        $webPartCount = 0
                        foreach ($wp in $wps) {
                            if ($wp.GetType().Name -eq $ExcelWebRenderer) {
                                Write-Host -BackgroundColor Blue -ForegroundColor White "Excel Web Part Found"
                                $webPartCount++
                            }
                        }
                        if ($webPartCount -gt 0) { $ExcelWps.Add($list) > $null }
                    }
                    catch [System.Exception] {
                        #Write-Host -ForegroundColor Red -BackgroundColor Black "Error"
                    }
                }
            }
        }
    }
    return ,$ExcelWps
}

Detecting Content Organizer Rules

Content Organizer Rules are an Enterprise feature usually found in Document or Record Centers. These rules are created by users to automatically route documents to the proper folder based on the document’s metadata. Look them up if you are not sure what they are; I think you will find them pretty cool.

Content Organizer Rules are stored in a hidden list using a content type named “Rule,” so we just iterate over the hidden SPList objects, checking whether the “Rule” content type is in use. If it is, the SPList is added to the return ArrayList.

function Get-ContentOrganizerRules {
    param([System.String]$siteUrl = "http://my.spdomain.org/team/Team_Site_Test/InternalTeam/DocCtr/")
    Write-Host -ForegroundColor Gray "Searching for Content Organizer Rules: " $siteUrl
    # ArrayList to hold any lists containing Content Organizer Rules
    [System.Collections.ArrayList]$CORules = New-Object System.Collections.ArrayList($null)
    $lists = Get-SPWeb $siteUrl |
        Select -ExpandProperty Lists |
        Where { $_.GetType().Name -eq "SPList" -and $_.hidden }
    foreach ($list in $lists) {
        foreach ($contentType in $list.ContentTypes) {
            if ($contentType -ne $null) {
                if ($contentType.Id.ToString() -eq "0x0100DC2417D125A4489CA59DCC70E3F152B2000C65439F6CABB14AB9C55083A32BCE9C" -and $contentType.Name -eq "Rule") {
                    Write-Host -BackgroundColor Green -ForegroundColor White "Content Organizer Rule found: " $list.Url
                    $CORules.Add($list) > $null
                }
            }
        }
    }
    return ,$CORules
}
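A quick way to try the function from the SharePoint 2010 Management Shell — a sketch only, using the example URL from above as a stand-in for a real Document Center:

```powershell
# Hypothetical usage: scan one web and summarize what was found
$rules = Get-ContentOrganizerRules "http://my.spdomain.org/team/Team_Site_Test/InternalTeam/DocCtr/"
Write-Host "Lists containing Content Organizer Rules:" $rules.Count
$rules | Select Title, ItemCount, Url
```

Because the function returns the ArrayList with a leading comma, the collection arrives intact instead of being unrolled by the pipeline, so `.Count` behaves as expected.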

Detecting InfoPath Forms

There is probably a cleaner way to detect usage of InfoPath forms, but this is what I found in a pinch. In my environment it caught at least a very large portion of InfoPath usages. I noticed that when you examine a list that has a custom InfoPath form instead of the default, SharePoint Designer will display, alongside the default NewForm.aspx and EditForm.aspx, additional forms called newifs.aspx and editifs.aspx respectively. I assume the “ifs” suffix stands for “InfoPath Form Services,” so I set out to detect those. In addition, Form Libraries can also use InfoPath forms, and those can be detected by inspecting their BaseTemplate property: if the BaseTemplate is “XMLForm,” an InfoPath form is being used.

function Get-InfoPathLists {
    param([System.String]$siteUrl = "http://teams.stjoe.org/team/Team_Site_Test")
    # ArrayList to hold any found Lists & Libs
    [System.Collections.ArrayList]$IFPLibs = New-Object System.Collections.ArrayList($null)
    $listsAndLibs = Get-SPWeb $siteUrl |
        Select -ExpandProperty Lists |
        Where { ($_.GetType().Name -eq "SPList" -or $_.GetType().Name -eq "SPDocumentLibrary") -and -not $_.hidden }
    foreach ($list in $listsAndLibs) {
        switch ($list.BaseType) {
            "DocumentLibrary" {
                if ($list.BaseTemplate -eq "XMLForm") {
                    Write-Host -BackgroundColor Green -ForegroundColor White "InfoPath List found: " $list.Url
                    $IFPLibs.Add($list) > $null
                }
            }
            { $_ -eq "GenericList" -or $_ -eq "Survey" } {
                if ($list.Forms -ne $null) {
                    foreach ($form in $list.Forms) {
                        # newifs.aspx / editifs.aspx indicate InfoPath Form Services forms
                        if ($form.Url -like "*ifs.aspx") {
                            Write-Host -BackgroundColor Green -ForegroundColor White "InfoPath List found: " $list.Url
                            $IFPLibs.Add($list) > $null
                        }
                    }
                }
            }
            default {
                Write-Host -BackgroundColor Red -ForegroundColor White "Unexpected List BaseType found: " $list.BaseType $list.Url
            }
        }
    }
    return ,$IFPLibs
}

Exporting to CSV

As mentioned before, the main loop collects all the ArrayLists returned by the detection functions for export to CSV (in my case pipe-delimited rather than comma). Just select the properties of each SPList and send them to the Export-Csv cmdlet. Since all the CSVs have the same columns, you can import them into Excel or Access and build your report.

$InfoPathResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\InfoPathResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$COResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\COResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$DCResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\DCResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$ExcelResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\ExcelResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$VisioResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\VisioResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$PPivotResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\PPivotResults.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
$AllResults | Select URL, Title, ItemCount, Author, Created, LastItemModifiedDate | Export-Csv  C:\eCalSniffereCalSniffer\All.csv -NoTypeInformation -Encoding UTF8 -Delimiter '|'
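If you would rather produce the combined report directly in PowerShell instead of Excel or Access, the per-category files can be read back and merged. This is only a sketch; it assumes the pipe-delimited files exported above already exist in the script’s folder, and the Category column it adds is my own addition, not part of the original exports:

```powershell
# Sketch: merge the per-category CSV exports into a single report file.
# Assumes the pipe-delimited *Results.csv files produced above exist here.
$folder = "C:\eCalSniffereCalSniffer"
$merged = foreach ($file in Get-ChildItem $folder -Filter *Results.csv) {
    Import-Csv $file.FullName -Delimiter '|' |
        # Tag each row with the source file so the categories stay distinguishable
        Select-Object @{Name='Category';Expression={$file.BaseName}}, *
}
$merged | Export-Csv (Join-Path $folder "Combined.csv") -NoTypeInformation -Encoding UTF8 -Delimiter '|'
```

Because every export uses identical columns, the rows line up without any reshaping; only the added Category column distinguishes them.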

Final Code

. .\get-InfopathLists.ps1
. .\Get-ContentOrganizerRules.ps1
. .\Get-DataConnectionLibs.ps1
. .\Get-ExcelWebParts.ps1
. .\Get-VisioWebParts.ps1
. .\Get_PowerPivotLibs.ps1
. .\Results-To-CSV.ps1
$siteUrls = @("http://my.spdomain.org","http://carenet.stjoe.org")
[System.Collections.ArrayList]$InfoPathResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$COResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$DCResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$ExcelResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$VisioResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$PPivotResults = New-Object System.Collections.ArrayList($null)
[System.Collections.ArrayList]$AllResults = New-Object System.Collections.ArrayList($null)
foreach ($siteUrl in $siteUrls) {
    $rootSite = New-Object Microsoft.SharePoint.SPSite($siteUrl)
    $spWebApp = $rootSite.WebApplication
    Write-Host -ForegroundColor Gray "Site Collections"
    Write-Host ""
    foreach ($site in $spWebApp.Sites) {
        Write-Host -ForegroundColor Gray "Site Collection URL: " $site.Url
        Write-Host ""
        Write-Host -ForegroundColor Gray "SubSites"
        foreach ($web in $site.AllWebs) {
            Write-Host -ForegroundColor Gray "SubSite URL: " $web.Url
            $InfoPathScan = Get-InfoPathLists $web.Url
            if ($InfoPathScan.Count -gt 0) { $InfoPathResults.AddRange($InfoPathScan); $AllResults.AddRange($InfoPathScan) }
            $CORulesScan = Get-ContentOrganizerRules $web.Url
            if ($CORulesScan.Count -gt 0) { $COResults.AddRange($CORulesScan); $AllResults.AddRange($CORulesScan) }
            $DataConnScan = Get-DataConnectionLibs $web.Url
            if ($DataConnScan.Count -gt 0) { $DCResults.AddRange($DataConnScan); $AllResults.AddRange($DataConnScan) }
            $ExcelWpsScan = Get-ExcelWebParts $web.Url
            if ($ExcelWpsScan.Count -gt 0) { $ExcelResults.AddRange($ExcelWpsScan); $AllResults.AddRange($ExcelWpsScan) }
            $VisioWpsScan = Get-VisioWebParts $web.Url
            if ($VisioWpsScan.Count -gt 0) { $VisioResults.AddRange($VisioWpsScan); $AllResults.AddRange($VisioWpsScan) }
            $PowerPivotScan = Get-PowerPivotLibs $web.Url
            if ($PowerPivotScan.Count -gt 0) { $PPivotResults.AddRange($PowerPivotScan); $AllResults.AddRange($PowerPivotScan) }
            $web.Dispose()
        }
        $site.Dispose()
    }
    $rootSite.Dispose()
}
$count = $AllResults.Count
Write-Host "Total Count :" $count
Write-Host "Exporting CSV"