File Export with the Help of Azure Active Directory Application

Problem Statement: Uploading individual files through the Data Import Export Framework (DIXF) can be a tedious task, especially when you have to deal with thousands of files. An organization therefore needs a mechanism that periodically exports files from a local workstation to D365, so that the database is updated with the contents of each exported file.

Introduction: Microsoft Dynamics 365 provides an out-of-the-box solution for handling files via its data management platform, as well as integration with Azure Active Directory via system administration. The data management platform handles file import and export, mapping of data files to AX tables, asynchronous integration, and recurrence scheduling. The scenario below can be applied to stock management, time & attendance management, and many other business processes.

Know your data: To start, we need to know the content and file type that is going to be uploaded. For the purposes of this blog, we will assume a time & attendance scenario that imports an Excel file with four columns: personnel number, date, time in, and time out.

Create a table: Once we know the data that will be exported from the local workstation, we need to create a table to store it for persistence and further business processing. For the scenario above, the table should look like this:

Create a Data Entity: The data entity allows staging (the extract, transform, and load [ETL] process) and mapping of the imported data (field-to-field mapping between the file and the AX table) from our service to the ImportAttendance table. To create a data entity, right-click the table and select Create data entity from the Add-ins submenu. Note: make sure the table has a primary key (index), otherwise the data entity cannot be created.

Setting up the Data management platform: Navigate to Workspaces > Data management. Click Import to create an import job and set the Group name, Description, Data project operation type, and Project category. Then click the +New File button. It will prompt for a Data entity name (the entity we just created) and a data source format (Excel in this case; if your data source format does not exist, create one under Data management > Data source format management). Now browse and upload a template Excel file with header data and 2-3 sample records so that DIXF can infer the mapping automatically (otherwise the mapping will fail and you will have to complete it manually).

If the mapping fails, click the failed icon in the map column to open the mapping table, which looks like this:

To complete the mapping manually, drag each source field to its target field to join them via the dots. Once complete, the map source to staging view should look something like this:

After the mapping is complete, go back to the import job, click the Create recurring data job button in the action pane, and fill in the information as shown (note that you will need an Azure application ID as an authorization policy, which is explained later in this blog):

After setting the configuration, we want to set up the recurrence of this data job, so click the Set processing recurrence button at the top of the dialog and fill in the information as shown (in this example the job starts on 22 Nov 2017 at midnight and runs every 12 hours):

Once you have completed this step by pressing OK and enabled the recurring data job, navigate to Workspaces > System administration > Data management IT > Recurring data jobs and click the job that you just created.

We can see that the data job has been created. This page maintains its execution logs, authorization policy, and further information for debugging each task the job performs. Now that we have set up our recurring data job in AX, it is time to set up Azure Active Directory using the Azure portal and a C# desktop application. Note that the ID of the newly created data job is important: it uniquely identifies which data job an exported file should be sent to for further processing.

Setting up Azure Active Directory (AAD): To set up Azure Active Directory to support a native application (or web API), go to the Azure portal. Many cloud services are available there for developers, but all we need is Azure Active Directory, so that is what we will use.

We want to register our native application in the Azure cloud so that it can export files to its storage. Click App registrations in AAD, then select the +New application registration button in the header of the application list. Now enter a name, application type, and sign-on URL, which should look similar to this:

After creating the application, we are returned to the application list page. Search for the newly created application and click it, which leads to a page similar to this:

Here, the Application ID is the most important value, as it will be used by the integration services and as the file import/export integration reference. Now that our application is ready, we need to do two more things.

  1. Give permissions to this native app so that it can communicate with the Microsoft Dynamics server. Click Add to add permissions and select Microsoft Dynamics ERP under Select an API. Click Select, then check every permission displayed by the API. Click Done to add Microsoft Dynamics ERP to your permissions list. Finally, click the Grant permissions button at the top of the header to finish this process.
  2. Add Owner (optional): if you want other personnel to access this application using their own account information, add them here; their credentials are required to communicate with AAD from the desktop application.

Congratulations! AAD is set up to integrate with D365, but one problem remains: D365 rejects API calls from external applications it does not recognize. Microsoft therefore provides out-of-the-box functionality to register AAD applications. Go back to D365's main page and navigate to Modules > System administration > Setup > Azure Active Directory applications. Create a new record, set the Application ID generated by the AAD application as the Client Id, and save it. Once saved, D365 will communicate freely with the AAD application. The page should look something like this:

Setting up the C# desktop application: To ease your burden, a template application developed by Microsoft itself already exists for developers to adapt to their requirements. The application can be found and downloaded at: After downloading and extracting the files, open DIXFSample.sln in Visual Studio. Once the files are loaded, open the App.config file (this is the only file that needs to be configured, as it contains values that depend on the user and the generated IDs). After configuring it, the file should look similar to this:
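Since the screenshot is version-specific, here is a hedged sketch of what the App.config typically contains. The key names and values below are illustrative placeholders, not the sample's authoritative schema; match them to whatever keys your downloaded version of DIXFSample actually defines:

```xml
<!-- Illustrative sketch only: key names vary between versions of the sample. -->
<configuration>
  <appSettings>
    <!-- D365 instance the files are sent to -->
    <add key="D365Uri" value="https://yourinstance.operations.dynamics.com" />
    <!-- Application ID of the AAD native app registered earlier -->
    <add key="AadClientAppId" value="00000000-0000-0000-0000-000000000000" />
    <!-- AAD tenant, e.g. contoso.onmicrosoft.com -->
    <add key="AadTenant" value="yourtenant.onmicrosoft.com" />
    <!-- Credentials of a user with access to D365 -->
    <add key="UserName" value="user@yourtenant.onmicrosoft.com" />
    <add key="Password" value="..." />
    <!-- ID (activity ID) of the recurring data job created in D365 -->
    <add key="RecurringJobId" value="..." />
    <!-- Local folder the application watches for files to export -->
    <add key="InputFolder" value="C:\Temp\Input" />
  </appSettings>
</configuration>
```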

Now run the application to cross-check the values that you have added.
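Under the hood, when the application runs it first authenticates against AAD using the values from App.config. A rough sketch of that step in Python (the sample itself is C#): a single OAuth 2.0 resource-owner password grant request to the AAD token endpoint. Every concrete value (tenant, client ID, credentials, URLs) is a placeholder.

```python
# Sketch of AAD token acquisition via the OAuth 2.0 password grant used by
# native apps. Not the Microsoft sample's code; all values are placeholders.
import json
import urllib.parse
import urllib.request

AAD_TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/token"

def build_token_request(tenant, client_id, username, password, d365_url):
    """Return the token endpoint URL and the form payload for the grant."""
    payload = {
        "grant_type": "password",
        "client_id": client_id,   # Application ID from the AAD app registration
        "username": username,     # a user with access to the D365 instance
        "password": password,
        "resource": d365_url,     # the D365 instance the token is scoped to
    }
    return AAD_TOKEN_URL.format(tenant=tenant), payload

def acquire_token(tenant, client_id, username, password, d365_url):
    """POST the payload to AAD and return the bearer token string."""
    url, payload = build_token_request(tenant, client_id, username,
                                       password, d365_url)
    data = urllib.parse.urlencode(payload).encode()
    with urllib.request.urlopen(url, data=data) as resp:
        return json.load(resp)["access_token"]
```

The returned bearer token is then attached to every call the application makes to D365.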

After verifying that everything is correct, go to the C:\Temp\Input directory and paste the file that you want to export. Return to the application, go to the Executions tab, and click Start. The result should look like this (the screenshot indicates that the file was correctly enqueued to the AAD application):

The file is successfully sent to the AAD application and will reside there until our recurring data job runs and imports it into AX for the mapping and staging process.
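The enqueue step above can be sketched as follows (Python for illustration; the sample itself is C#). The endpoint shape follows D365's recurring-integrations API, where the activity ID is the ID of the recurring data job noted earlier; the instance URL, job ID, and entity name are placeholders.

```python
# Sketch of the enqueue call performed when you click Start: the file body is
# POSTed to the recurring-integrations endpoint with a bearer token.
import urllib.request

def build_enqueue_url(d365_url, activity_id, entity_name):
    """Build the recurring-integrations enqueue URL for a data job."""
    return "{}/api/connector/enqueue/{}?entity={}".format(
        d365_url.rstrip("/"), activity_id, entity_name)

def enqueue_file(d365_url, activity_id, entity_name, token, file_path):
    """Send one exported file to the recurring data job identified by activity_id."""
    url = build_enqueue_url(d365_url, activity_id, entity_name)
    with open(file_path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Authorization", "Bearer " + token)
    with urllib.request.urlopen(req) as resp:
        return resp.status  # HTTP 200 means the file was enqueued
```

This is why the data job ID matters: it is baked into the URL, so each exported file lands in exactly one recurring data job's queue.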

Results: The result of our imported file can be seen on the recurring data job screen (Workspaces > System administration > Data management IT > Recurring data jobs). It should look similar to this:

Extra information about the file can be seen by clicking Execution details. All the information is logged here: whether the file data was correct and, if not, at which stage the import job failed (mapping, staging, etc.). If you open the staging table or the mapped AX table, you will see it filled with the imported data (provided the import was successful).

Congratulations! You have worked step by step through creating a recurring import job that handles data exported from your local machine. The following is a discussion of minor problems that could occur while implementing this scenario.

  1. Make sure that the Microsoft Dynamics Batch Management Service is running. Why? Data jobs are processed asynchronously in the background and therefore require the batch service; without it, the recurring data job will never import and process the file exported from your local computer.

  2. In case of an error like “File data to target entity data project did not succeed”, make sure that the Azure Storage Emulator (Microsoft.WindowsAzure.Storage) is up and running. To solve this, follow these steps:
    • Run command prompt as administrator
    • Navigate to “C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\” (default path)
    • Type the command: “AzureStorageEmulator.exe start”

Closing: This blog covered exporting any type of file to D365 from an application on a local machine, with the help of Azure Active Directory. I hope this walkthrough has equipped you with enough information to integrate any application outside of D365 using an Azure Active Directory application.

ERP Implementation Services at Systems Limited with half a year of implementation experience in Microsoft Dynamics 365 for Finance and Operations. Danyal has implementation experience in Human Resources, Payroll and Time & attendance modules. Danyal is based in Karachi.

Disclaimer: The views expressed here are solely those of the author in his private capacity and do not in any way represent the views of Systems Limited, or any other entity related to Systems Limited.
