Assess an enterprise and consolidate assessment reports with DMA

Francesco Mantovani
7 min read · Aug 30, 2021

This post has the same title as the official Microsoft guide Assess an enterprise and consolidate assessment reports with DMA.

The page was recently removed, but luckily I was able to find it on the Wayback Machine.

Microsoft has a habit of dropping tools without warning; Database Diagrams, for example.

So if you need to migrate your SQL Server estate to Azure and want to see the big picture through Power BI, look no further: on this page you will find all the tools you need.

Prerequisites

You need to download these three tools:

PowerShell-Modules: if the link is broken, download it from here

LoadWarehouse script: if the link is broken, download it from here

DMA Reports Power BI template: if the link is broken, download it from here

Keep in mind that you can also find the three files in this discussion: https://techcommunity.microsoft.com/t5/microsoft-data-migration/assess-an-enterprise-with-data-migration-assistant-update/bc-p/2416149#M188, at least until Microsoft nukes that post too.

Setup the PowerShell-Modules

Download the .zip and put the content into the folder C:\Program Files\WindowsPowerShell\Modules\DataMigrationAssistant

PowerShell Modules

You may need to unblock the PowerShell files after you save them to the WindowsPowerShell directory to ensure that the modules load correctly. To unblock a PowerShell file, right-click on the file, select Properties, select the Unblock check box, and then select OK.

psm1 file properties
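If you prefer to script this step instead of clicking through the Properties dialog, here is a minimal PowerShell sketch (it assumes the default module path used above; run it from an elevated prompt):

# Remove the "downloaded from the internet" flag from every module file
Get-ChildItem -Path 'C:\Program Files\WindowsPowerShell\Modules\DataMigrationAssistant' -Recurse -File | Unblock-File

# Verify that PowerShell can now discover the module
Get-Module -ListAvailable DataMigrationAssistant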

Importing the DMA output files

If you have used the DMA tool in bulk to assess your SQL Server estate, you now have the output saved as JSON files.
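(A side note: if you still need to produce those JSON files, DMA can also be driven from the command line. Below is a sketch based on the DMA command-line documentation; the assessment name, server, database, and output path are placeholders, so adjust them to your estate.)

# Run from the DMA install folder (typically C:\Program Files\Microsoft Data Migration Assistant)
.\DmaCmd.exe /Action=Assess /AssessmentName="MyAssessment" `
  /AssessmentDatabases="Server=MYSERVER;Initial Catalog=MyDatabase;Integrated Security=true" `
  /AssessmentEvaluateCompatibilityIssues /AssessmentOverwriteResult `
  /AssessmentResultJson="C:\temp\MyAssessment.json"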

We are going to run this command to import them into SQL Server. The command will also create two databases, DMAReporting and DMAWarehouse:

dmaProcessor -processTo SQLServer -serverName "localhost" -CreateDMAReporting 1 -CreateDataWarehouse 1 -databaseName DMAReporting -warehouseName DMAWarehouse -jsonDirectory C:\temp

These are the parameters:

processTo: The location to which the JSON file will be processed. Possible values are SQLServer and AzureSQLDatabase.

serverName: The SQL Server instance to which data will be processed. If you specify AzureSQLDatabase for the processTo parameter, include only the SQL Server name (don't include .database.windows.net). You'll be prompted for two logins when targeting Azure SQL Database; the first is your Azure tenant credentials, the second is your admin login for the Azure SQL Server.

CreateDMAReporting: The staging database to create for processing the JSON file. If the database you specify already exists and you set this parameter to 1, objects don't get created. This parameter is useful for recreating a single object that has been dropped.

CreateDataWarehouse: Creates the data warehouse that will be used by the Power BI report.

databaseName: The name of the DMAReporting database.

warehouseName: The name of the data warehouse database.

jsonDirectory: The directory containing the JSON assessment files. If there are multiple JSON files in the directory, they're processed one by one.
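For example, to process the same folder into an Azure SQL Database target instead (as noted for serverName above, the server name is given without .database.windows.net; the server name here is a placeholder):

dmaProcessor -processTo AzureSQLDatabase -serverName "myazureserver" -CreateDMAReporting 1 -CreateDataWarehouse 1 -databaseName DMAReporting -warehouseName DMAWarehouse -jsonDirectory C:\temp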

The import should take a few seconds to execute, but I've also seen it run for minutes.

The speed is not determined by the size of the JSON files.

Consuming the assessment JSON file

If you run into the error Error writing results for file C:***** to database, I have found a workaround: copy the JSON file to another machine and run the import there.

This is definitely an environmental problem, and I have seen it happen from time to time. Just use another computer.
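For what it's worth, that workaround can be scripted too; a sketch assuming a hypothetical second machine named OTHERMACHINE reachable over its admin share:

# Copy the assessment JSONs over, then rerun dmaProcessor on the other machine
Copy-Item -Path 'C:\temp\*.json' -Destination '\\OTHERMACHINE\c$\temp'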

Loading the data warehouse

Now that the two databases are created we need to run the LoadWarehouse script. This will take the data from the ReportData table in the DMAReporting database and load it into the DMAWarehouse.

As soon as you open the script, you will see this advice:

/*
IMPORTANT - Before running the script, press CTRL+SHIFT+M to set the database name parameters.
RUN EACH STEP INDIVIDUALLY.
TAKE CARE TO READ THE COMMENTS.
*/

So press CTRL+SHIFT+M (the SSMS shortcut for Specify Values for Template Parameters) and set the names of the two databases you have created.

In my case I’m going to use the default names DMAReporting and DMAWarehouse.

If you have chosen different names, it's now time to change the variables.

LoadWarehouse variable

This trick allows you to set the names of the two databases throughout the LoadWarehouse script.

As you can see, the script is divided into 4 steps, and the comments suggest running each step individually.

Step 1: all good, it will run without problems.

-- Step 1 - cleans object types. (this should be a temp step until tool is fixed)
UPDATE ..ReportData
SET ImpactedObjectType = 'Database Options'
WHERE ImpactedObjectType = 'DatabaseOptions'

Step 2: same here, no problems.

-- These scripts use 3 part names. Ensure you update your database names
-- Step 2 - Capture any missing rules
INSERT INTO ..dimRules (RuleID, Title, Impact, Recommendation, MoreInfo, ChangeCategory)
SELECT DISTINCT rd.RuleId
, rd.Title
, rd.Impact
, rd.Recommendation
, rd.MoreInfo
, rd.ChangeCategory
FROM ..ReportData rd
LEFT JOIN ..dimRules dr
ON rd.RuleId = dr.RuleID
AND rd.Title = dr.Title
WHERE dr.RuleID IS NULL

-- Capture any missing object types
INSERT INTO ..dimObjectType (ObjectType)
SELECT DISTINCT rd.ImpactedObjectType
FROM ..ReportData rd
LEFT JOIN ..dimObjectType ot
ON rd.ImpactedObjectType = ot.ObjectType
WHERE ot.ObjectType IS NULL

Step 3: here come the problems. If you run the third step from the official LoadWarehouse script, you will hit a bug.

You first have to go to the table dbo.FactAssessment > Design and allow NULLs for the SourceCompatKey column.

Sometimes the TargetCompatKey column is affected too.

SourceCompatKey
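If you prefer to make the same change from a script rather than through the designer, here is a sketch assuming the SqlServer PowerShell module, the default database names, and that both key columns are INT (verify the actual types in Design first):

# Allow NULLs on the compat keys so Step 3 can insert rows that lack them
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'DMAWarehouse' -Query @'
ALTER TABLE dbo.FactAssessment ALTER COLUMN SourceCompatKey INT NULL;
ALTER TABLE dbo.FactAssessment ALTER COLUMN TargetCompatKey INT NULL;
'@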

You can now execute the third step, once you uncomment the INSERT INTO statement:

-- Step 3 - Run select statement first to ensure no NULL keys (except dbowner which is expected to be null at this point).
-- IMPORTANT - Once happy uncomment out the INSERT statement and run again
-- INSERT INTO ..FactAssessment(DateKey, StatusKey, SourceCompatKey, TargetCompatKey, Categorykey, SeverityKey, ChangeCategorykey, RulesKey, AssessmentTargetKey, ObjectTypeKey, DBOwnerKey, InstanceName, DatabaseName, SizeMB, ImpactedObjectName, ImpactDetail, AssessmentName, AssessmentNumber, ServerVersion, ServerEdition)
SELECT dd.DateKey AS "DateKey"
,ds.StatusKey AS "StatusKey"
,sc.SourceCompatKey AS "SourceCompatKey"
,tc.TargetCompatKey AS "TargetCompatKey"
,dc.CategoryKey AS "CategoryKey"
,dsev.SeverityKey AS "SeverityKey"
,dcc.ChangeCategoryKey AS "ChangeCategoryKey"
,dr.RulesKey AS "RulesKey"
,AssessmentTargetKey AS "AssessmentTargetKey"
,ot.ObjectTypeKey AS "ObjectTypeKey"
,dbo.DBOwnerKey AS "DBOwnerKey"
,dma_rd.InstanceName AS "InstanceName"
,dma_rd.Name AS "DatabaseName"
,SizeMB AS "SizeMB"
,COALESCE(ImpactedObjectName, 'NA') AS "ImpactedObjectName"
,COALESCE(ImpactDetail, 'NA') AS "ImpactDetail"
,AssessmentName
,AssessmentNumber
,ServerVersion
,ServerEdition
FROM ..reportdata dma_rd
LEFT JOIN ..dimDate dd
ON CONVERT(CHAR(8),dma_rd.ImportDate,112) = dd.DateKey
LEFT JOIN ..dimStatus ds
ON dma_rd.Status = ds.Status
LEFT JOIN ..dimSourceCompatibility sc
ON dma_rd.SourceCompatibilityLevel = sc.SourceCompatibilityLevel
LEFT JOIN ..dimTargetCompatibility tc
ON dma_rd.TargetCompatibilityLevel = tc.TargetCompatibilityLevel
LEFT JOIN ..dimCategory dc
ON dma_rd.Category = dc.Category
LEFT JOIN ..dimSeverity dsev
ON dma_rd.Severity = dsev.Severity
LEFT JOIN ..dimRules dr
ON dma_rd.RuleId = dr.RuleID
AND dma_rd.title = dr.Title -- there is a ruleid being used for 2 different titles
LEFT JOIN ..dimAssessmentTarget ast
ON dma_rd.AssessmentTarget = ast.AssessmentTarget
LEFT JOIN ..dimChangeCategory dcc
ON dma_rd.ChangeCategory = dcc.ChangeCategory
LEFT JOIN ..dimObjectType ot
ON CASE WHEN dma_rd.ImpactedObjectType IS NULL OR dma_rd.ImpactedObjectType = '' THEN 'NA' ELSE ImpactedObjectType END = ot.ObjectType
LEFT JOIN ..dimDBOwner dbo
ON dma_rd.InstanceName = dbo.InstanceName
AND dma_rd.Name = dbo.DatabaseName
WHERE IsLoaded = 0

Step 4: this step is optional and I never use it.

-- Step 4 - update database owners
-- This step is optional. This will enable the powerbi reports to filter on specific teams / owners
-- Populate database owners
-- Repeat for every instance / database
INSERT INTO ..dimDBOwner (InstanceName, DatabaseName, DBOwner)
VALUES ('ExampleInstance', 'ExampleDatabase', 'ExampleOwner')

-- Once DBOwner dimension is populated update factassessment table
UPDATE fa
SET fa.DBOwnerKey = db.DBOwnerKey
FROM ..FactAssessment fa
JOIN ..dimDBOwner db
ON fa.InstanceName = db.InstanceName
AND fa.DatabaseName = db.DatabaseName;
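Before moving on to Power BI, a quick sanity check that the warehouse actually received rows (assuming the default names used above):

# Count the fact rows loaded by Step 3
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'DMAWarehouse' -Query 'SELECT COUNT(*) AS FactRows FROM dbo.FactAssessment'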

DMA readiness with Power BI

And here we are at the last step.

Double-click the DMA Reports Power BI template; a window will prompt you to enter your Server Name and Database Name:

DMA Reports Power BI template loaded

The Power BI template will automatically load the data from the DMAWarehouse database.

Don’t forget you can select multiple databases by using CTRL + Left click.

If you right-click on a database and go to Drillthrough > Remediation Plan, you will see all the actions you need to take to finalize the migration.

Conclusion

I’ve been using Power BI since 2016 and it has been getting better every year.

I like the way we can easily create dashboards and share them. Much easier than SSRS.

It's good to see that Microsoft was able to create a simple template that lets you check at a glance the job you have to do before a migration. But don't be fooled: you still have to check for false positives, as with any software that does bulk assessments.

Unfortunately, this useful template is no longer maintained, and one day it will simply stop working.

In exchange, Microsoft is proposing Azure Migrate, which is still in preview.

…and you know what “preview” means for Microsoft, right?

https://www.gotknowhow.com/media/article/images/2009/01/29/79-original-error_reporting_send_window.jpg

…you are the tester, my dear.
