Let’s face it, every shop can benefit from automation. Automation with MicroStrategy System Manager improves reliability and efficiency. What do I mean? Well, let’s say you have very important executive reports that need to be sent every Sunday but you do not want to send the reports in the event that a data load fails because then they will be wrong. In addition, if the data load fails and the reports do not go out, you would like the BI team to be notified. This is a very common type of requirement in the wonderful world of BI. So, what should you do? Develop a custom process outside of MicroStrategy? No, this is a perfect scenario that can be handled by System Manager.
Let’s get started developing a process with System Manager to fulfill this requirement.
Step 1. Open a new workflow in System Manager
Open System Manager and click File->New Workflow.
Step 2. Determine if the current day is Sunday
In System Manager, click the question mark icon in the upper left and then click anywhere on the empty white space in our new workflow to create a new decision object. Now, click the white mouse icon to the right of the question mark; otherwise, every time you click inside the workflow designer a new decision object will be created.
This is an important object because it helps you control the flow of the process. It can check for files, days of the week, or other parameter values. For now, we are going to use it to check whether the current day of the week is Sunday.
Now, select the new decision object and a list of properties should appear on the right. Choose the option “Weekday(s) Check” and check the option for “Sunday”. Also, rename this object “Sunday” by right-clicking it and choosing Rename.
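If it helps to have a mental model, here is roughly what the “Sunday” decision object is evaluating, sketched in Python purely for illustration (System Manager does this for you through the “Weekday(s) Check” property):

```python
# Illustrative sketch only: the logic behind a "Weekday(s) Check" set to Sunday.
from datetime import date

def is_sunday(today=None):
    """Return True if the given (or current) date falls on a Sunday."""
    today = today or date.today()
    return today.weekday() == 6  # Monday is 0, Sunday is 6

print("SUCCESS" if is_sunday() else "FAILURE")
```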
Step 3. Determine if ETL data load succeeded or failed
Usually in BI, data loads are performed by an ETL tool with processes developed by the ETL team. It is common practice to set some sort of flag upon successfully loading the data. In my current environment, our ETL team creates what we call a “trigger file”. This is simply an empty file the ETL process creates in a specific directory after the data is successfully loaded. The presence of this file lets other processes know that the database is up to date with the correct data. Another approach is to set a value in a database table after the data is loaded. Either approach will work with System Manager. For now, let’s assume the ETL process creates an empty file upon successfully loading the data.
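For context, this is the kind of step an ETL job might run at the very end of a successful load; the directory and file name below are made up for illustration, so adjust them to whatever convention your ETL team uses:

```python
# Illustrative only: touch an empty trigger file once the load has finished.
# The directory and file name are hypothetical examples.
from pathlib import Path

TRIGGER_FILE = Path(r"D:\etl\flags\load_complete.trg")

def write_trigger_file():
    """Create (or refresh) the empty trigger file that signals a successful load."""
    TRIGGER_FILE.parent.mkdir(parents=True, exist_ok=True)
    TRIGGER_FILE.touch()

write_trigger_file()
```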
Now, in System Manager, click the question mark icon in the upper left and then click anywhere on the empty white space in our new workflow to create another decision object. Next, click the white mouse icon to the right of the question mark; otherwise, every time you click you will keep creating new decision objects.
Next, select the new decision object and a list of properties should appear on the right. Choose the option labeled “File/Directory Check” and enter the full path to the trigger file. Also, make sure the drop-down box labeled “File/Directory Check Condition” has the value “Exists”. This means that if the file exists this object will return SUCCESS; otherwise it will return FAILURE. Rename this object “Check Flag”.
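In other words, the “Check Flag” object is effectively performing an existence check like the one below (again, a Python sketch for illustration only, using the same hypothetical trigger file path):

```python
# Illustrative only: what a "File/Directory Check" with condition "Exists" evaluates.
from pathlib import Path

TRIGGER_FILE = Path(r"D:\etl\flags\load_complete.trg")  # hypothetical path

# File present -> SUCCESS branch; file missing -> FAILURE branch
print("SUCCESS" if TRIGGER_FILE.exists() else "FAILURE")
```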
Finally, click the green arrow button in the upper left menu, then click the “Sunday” decision object and drag your mouse pointer over to the “Check Flag” decision object. This creates a link between the two objects so that if the “Sunday” decision object returns SUCCESS, the process moves on to the next step and executes the “Check Flag” decision object. Your process should look similar to below.
It is also worth mentioning that if the file is available via FTP, we could use the File Transfer Download object instead. And if the ETL process sets a database value after loading the data instead of creating an empty file, we could use the Execute SQL task to determine whether the data load was successful. The point is that there are many ways to handle this part of the process. It really depends on your environment; however, the methods I have pointed out are the ones I see most often.
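If your ETL team uses the database-flag approach, the Execute SQL task would run a query along these lines. This is only a hedged sketch: the driver, DSN, credentials, table, and column names are hypothetical and will differ in your environment.

```python
# Hypothetical example of checking a load-status flag in the warehouse.
# The DSN, credentials, table, and column names are made up for illustration.
import pyodbc

def load_succeeded():
    """Return True if today's ETL load is marked as complete."""
    conn = pyodbc.connect("DSN=DWH;UID=bi_reader;PWD=secret")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT load_status FROM etl_load_status "
            "WHERE load_date = CAST(GETDATE() AS DATE)"  # SQL Server syntax; adjust for your database
        )
        row = cursor.fetchone()
        return row is not None and row[0] == "COMPLETE"
    finally:
        conn.close()
```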
Step 4. Execute the reports
To execute the reports for this example we are going to use a Narrowcast Server object. In the left tree menu under MicroStrategy Products->Command Manager, click “Narrowcast Server (Windows Only)” and drag it over to the workflow designer to create a new object. Next, enter the logon info for your Narrowcast metadata, choose the option to “Execute script statements”, and enter the script command to execute the Narrowcast job you would like to run. See the example below.
Next click the green arrow button, then click the Check Flag decision object and drag your mouse pointer to the new Narrowcast object. Your workflow should now look similar to below.
At this point our workflow checks whether the current day is Sunday; if it is, it checks that the data has been loaded based on the existence of the trigger file; and if the file exists, it executes the executive reports via Narrowcast.
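Condensed into a few lines of illustrative Python, the decision logic built so far looks like this (System Manager expresses the same flow graphically through the decision objects and their SUCCESS/FAILURE links):

```python
# Illustrative summary of the workflow logic so far; the path and actions are placeholders.
from datetime import date
from pathlib import Path

TRIGGER_FILE = Path(r"D:\etl\flags\load_complete.trg")  # hypothetical path

def run_workflow():
    if date.today().weekday() != 6:   # "Sunday" decision object
        return                        # not Sunday: nothing to do
    if TRIGGER_FILE.exists():         # "Check Flag" decision object
        print("Execute the executive reports via Narrowcast")
    else:
        pass                          # failure handling is added in the next step

run_workflow()
```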
Step 5. Alert BI team if data load fails
Currently the workflow will execute the report job if everything works, but what if the data load fails and the trigger file does not exist? Now we need to add logic to handle failures.
In the left tree view under the System Tools folder, click the “Send Email” object and drag it over to the workflow. Next, click the red arrow, then click the “Check Flag” decision object and drag your mouse pointer over to the new “Send Email” object; this link is followed when the check returns FAILURE. Rename the Send Email object “Alert BI Team”. Finally, enter the Send Email properties such as From, To, Subject, and Body; you will also need the connection info for your company’s SMTP server. The finished workflow should look similar to below. To run the workflow, click the lightning bolt located in the top menu.
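Behind the scenes, the “Alert BI Team” object is doing something very similar to the Python sketch below; the server name, port, and addresses are placeholders, so substitute your own SMTP details.

```python
# Illustrative only: sending the failure alert over SMTP.
# Server, port, and addresses are made-up placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sysmanager@example.com"
msg["To"] = "bi-team@example.com"
msg["Subject"] = "Executive reports NOT sent - data load failed"
msg.set_content("The Sunday trigger file was not found, so the executive reports were skipped.")

with smtplib.SMTP("smtp.example.com", 25) as server:
    server.send_message(msg)
```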
Finally, save the workflow with a name of your choosing and that’s it! Now, this exact solution may not work for everyone because I have assumed that you send reports using Narrowcast, that your ETL process sets some sort of flag indicating whether the load succeeded, and that you have an SMTP server available to send email. A lot of this is pretty standard, but the main point of this post is to introduce you to the kinds of things System Manager can handle to make your environment a little more manageable and efficient.
In my next post, I will show you how to schedule this System Manager workflow to run automatically.
Please share your comments!