SQL Server Integration Services Dashboard

A nice and useful HTML5 SSIS Dashboard

Davide Mauri
Apr 18, 2017

I haven’t been working on Integration Services lately, but presenting at SQL Saturday #613 in Redmond, WA, gave me an excuse to keep improving the HTML5 Integration Services dashboard I started developing back in 2014 as an effort to learn Python.

http://ssis-dashboard.azurewebsites.net

With the displayed data you have everything needed to figure out whether your SQL Server Integration Services packages are running fine or whether executions are failing.

How? Well, since version 2012, SQL Server Integration Services logs an interesting amount of data in the SSISDB database. While *all* the information needed to figure out why a package is not running as expected, be it something that drives the package to fail or to perform poorly, is there, it can be complex and time consuming to figure out what each table contains and how to use it. In addition to that, it would be nice to have a dashboard that, with just one glance, can tell you whether you’re good or you’re going to have a long day. In the latter case, it would be even better if it could also help you understand what’s going on.
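For reference, the execution rows that SSISDB logs in the catalog.executions view carry a numeric status code. A tiny helper like the following (a sketch for illustration, not code taken from the dashboard) decodes those values as documented for catalog.executions:

```python
# Status codes documented for SSISDB's catalog.executions view.
STATUS_NAMES = {
    1: "Created",
    2: "Running",
    3: "Canceled",
    4: "Failed",
    5: "Pending",
    6: "Ended unexpectedly",
    7: "Succeeded",
    8: "Stopping",
    9: "Completed",
}


def status_name(code):
    """Return a human-readable name for an SSISDB execution status code."""
    return STATUS_NAMES.get(code, "Unknown")
```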

The dashboard shows a complete picture of the package execution status, current and in the early past

The SQL Server Integration Services Dashboard does exactly this. Here’s the feature list:

Engine KPI

Quickly tells you how many packages in the last few days (the time window is configurable in the configuration file) were in one of the well-known statuses. Ideally you would like to see that all executed packages executed successfully. Each tile is clickable and allows you to filter the packages by status.
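The idea behind the KPI tiles can be sketched like this (a minimal illustration, assuming execution rows have already been fetched and decoded; the names here are hypothetical, not the dashboard’s actual functions):

```python
from collections import Counter
from datetime import datetime, timedelta


def kpi_counts(executions, days=7, now=None):
    """Count executions per status within the last `days` days.

    `executions` is a list of (start_time, status) tuples; `days`
    mirrors the dashboard's configurable time window.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=days)
    return Counter(status for start, status in executions if start >= cutoff)
```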

The “Running” tile also tells you how many packages are being executed right now.

Package Execution Chart

It shows, day by day, the number of packages that were asked to be executed (“created” here means that the execution context was created and the engine was ready to execute the package), the executed packages, how many succeeded and how many failed. Ideally, here you’d like to see a wall of green: all executed packages executed successfully.
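The per-day aggregation behind the chart could be sketched as follows (an illustrative sketch, assuming statuses have already been decoded to names; not the dashboard’s actual code):

```python
from collections import defaultdict


def daily_series(executions):
    """Group executions into per-day counts for the chart.

    `executions` is a list of (day, status) pairs. Every row counts as
    "created" (its execution context was created); succeeded and failed
    are counted from the decoded status.
    """
    days = defaultdict(lambda: {"created": 0, "succeeded": 0, "failed": 0})
    for day, status in executions:
        days[day]["created"] += 1
        if status == "Succeeded":
            days[day]["succeeded"] += 1
        elif status == "Failed":
            days[day]["failed"] += 1
    return dict(days)
```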

Package Execution Table

The table shows details of the last package executions. For each execution you can see whether it was tied to a specific environment, which logging level was used, how much time it took to complete, and how many warnings and errors it raised.

The first four columns contain clickable values. By clicking on the first column, where the execution id is displayed, you’ll be taken to the execution details page. By clicking on the Project column, you’ll filter everything to show only packages contained in that project. The Package column will bring you to that package’s execution history page. And by clicking on the Info column you will be shown the execution values page.

Execution Details

As the name implies, the execution details page gives you access to detailed execution information. At the top, tiles show you how many events were logged and how many errors and warnings were raised. Two warning types have dedicated tiles: Memory Warnings and Duplicate Warnings.

A memory warning is issued each time, for any reason, the Integration Services engine cannot allocate enough memory and thus has to swap to disk or throttle the data flow. In both cases it means poor performance, so you really want to keep an eye on that.

A duplicate warning is issued any time a lookup transformation in a Data Flow finds a duplicate key. That Data Flow will continue to run, but it may produce non-deterministic results. If you do get a duplicate warning, you surely want to check why duplicate lookup keys are appearing in your system.

Clicking on any tile will open a page showing the log, so you can read the full log messages:

The Executables table shows all the executables (Tasks and Containers) and the total time each executable was running. This can help you figure out which executable is taking most of the elapsed time, so that you can try to optimize it. It also shows whether that executable executed correctly or not.
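Totalling the running time per executable is straightforward; a sketch of the idea (assuming per-run durations have already been read, e.g. from SSISDB’s catalog.executable_statistics view; the function name is hypothetical):

```python
def executable_totals(rows):
    """Total running time per executable, slowest first.

    `rows` is a list of (executable_name, duration_seconds) pairs;
    the same executable may appear multiple times (loops, retries).
    """
    totals = {}
    for name, seconds in rows:
        totals[name] = totals.get(name, 0.0) + seconds
    # Sort descending so the most expensive executable comes first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```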

Execution history

The execution history page shows details about the last successful executions of the selected package along with a chart showing the elapsed time values and the related moving average.

The moving average is also used to estimate the execution time of the current execution, if the package is being executed.

The estimation is really simple, as you can see, and since it is based on a moving average it will work well only if package executions don’t have any kind of periodicity and their execution times are fairly consistent.
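A moving-average estimate like the one described boils down to very little code. This is a sketch of the technique, not the dashboard’s actual implementation:

```python
def moving_average(values, window=5):
    """Simple moving average over the last `window` values."""
    recent = values[-window:]
    return sum(recent) / len(recent)


def estimated_remaining(history, elapsed, window=5):
    """Estimate remaining seconds for a running package.

    `history` holds elapsed times of past successful executions;
    `elapsed` is the time the current execution has already spent.
    """
    return max(moving_average(history, window) - elapsed, 0.0)
```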

Now that SQL Server supports R, it would be great to have a better estimation algorithm. I would probably go with ARIMA to start with, and then also add information about the volume of rows processed in order to improve model accuracy. I don’t know when I’ll be able to do that, so if you want to help, head over to GitHub, fork the project and go with it.

Execution Values

The execution values page shows the values set for the package at run time. Values can be set via Parameters, by directly changing the Connection Manager properties, or by overriding property values manually. Parameter values can be set ad hoc or by referencing an Environment.

In any case, the values set for that specific execution will be visible here:

Navigation Toolbar

The navigation toolbar has a drop-down menu that allows you to easily navigate your folders and projects:

Each element is clickable and will filter the data for the specific value.

Source Code

Source code is available on GitHub

https://github.com/yorek/ssis-dashboard

and it uses Python 3.5, HTML5, Bootstrap, jQuery, and other nice web technologies. Information on how to download, configure, and run it is on GitHub too.

I’ll be more than happy if this project really becomes a community project. If you’re interested in helping the development, please let me know so that we can define and prioritize features and development.

Enjoy!


Davide Mauri

Data Geek, Storyteller, Developer at heart, now infiltrated in Azure SQL product group to make sure developers voice is heard loud and clear. Heavy Metal fan.