In the previous post, we reviewed how to utilize data templates in Dynamics 365 Fin Ops.
Essentially, when you export a data project built from data templates, it is created as a data package. In this post, let us review how we can use data task automation to consume these templates, what the benefits of data task automation are, and how we can leverage LCS alongside this tool.
Let us first review what the export of a template looks like. In the Data management workspace, if you create either an export or an import project, you will see an option called “Add template”.

When you choose this option, the data entities that are added in the template show up here, in the sequence defined in the template.

Once you export this data project and download the package, you can see that its structure is similar to that of a normal data package generated when you add data entities manually and import them.
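As a rough sketch, the downloaded package is a zip containing the package manifest and header plus one generated data file per entity; the file names below are illustrative and depend on the entities in the template and the source data format:

```
<PackageName>.zip
├── Manifest.xml         <- entity list, sequencing, and field mapping
├── PackageHeader.xml    <- package metadata
└── <one data file per entity, for example an .xlsx file per entity>
```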


Data packages generated using templates can be consumed by data task automation. Before we jump into how to use data task automation, let us first understand what it is and what purpose it serves from an implementation perspective.
Many a time during implementations, we have thought about (or even solved) the pain point of loading data packages with a single click once they are available in a centralized repository.
Data task automation intends to solve exactly this, allowing implementation teams, be it partners or customers, to automate their data and configuration migration strategy, leveraging the out-of-the-box data import/export framework functionality and working hand in hand with LCS.
Furthermore, for customers and partners with mature ALM (Application Lifecycle Management) processes, data task automation manifests can be managed in source control.
Data task automation is by no means limited to configuration and data migration; you can extend it further as a validation tool for your export or integration scenarios. Note, however, that it is not recommended in production when any of the import or export tasks use or depend on APIs; any data task automation that involves an API should be strictly reserved for automated testing.
Here is a screenshot of the Data task automation form, which is located under the Data management workspace.

As you can see, there are two options highlighted:
1. Load tasks from file: This option allows you to load the manifest file from a local drive. This means the Solution Architect or the Functional Lead should own this process. Remember, the key idea behind automating data and configuration migration is better control, and allowing multiple people to maintain different manifests would result in chaos. Of course, if teams are split by workstream (Supply Chain, Finance, O2C, P2P, etc.), it can make sense for each team to maintain its own manifest, but even then the recommended approach is to have a single contact person own the process.

2. Load default tasks: This is where data task automation together with a defined ALM strategy comes into play, for customers and partners who understand the importance of keeping configuration and data migration controlled. The manifests in which the tasks are embedded are maintained as resources in the AOT. Once a manifest is ready, all you need to do is have a developer check it in as a resource, and the next time you open this form your manifest is available here.


Now that we understand what data task automation is, how it can be utilized, and the various ways to manage a manifest, let us review what a manifest looks like and run a sample task that loads a package from an LCS project. In my case, I am going to load the manifest from my local machine.
Structure of a manifest
A manifest is organized into four groups (a minimal sketch follows the list):
- Task manifest: The core structure, which defines the name of the manifest and the common attributes, parameters, and behaviors shared by all the tasks in the manifest file.
- Data files: Defines all the files that will be loaded from the LCS Shared asset library or the LCS project asset library.
- Data project definition: Defines the basic configuration of the data project, such as whether it is an import or an export project, whether an import runs asynchronously or as a batch, the legal entity the data needs to be imported into, and so forth.
- Entity setup: Defines the characteristics of the individual entities that are part of the package. For example, you may want different processing for different entities, such as setting default values at the staging level or skipping staging altogether.
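To make these four groups concrete, here is a minimal, illustrative sketch based on the schema documented for data task automation manifests; the asset name, LCS project ID, legal entity, and task title below are placeholders, not values to copy verbatim:

```xml
<?xml version='1.0' encoding='utf-8'?>
<!-- Task manifest: the name plus setup shared by all tasks in the file -->
<TestManifest name='Configuration load - vendor setup'>
  <SharedSetup>
    <!-- Data files: a package pulled from the LCS project asset library -->
    <DataFile ID='VendorPackage' name='Vendor group and payment terms'
              assetType='Data package' lcsProjectId='1234567'/>
    <!-- Data project definition: an async import into one legal entity -->
    <JobDefinition ID='ImportJobDefinition'>
      <Operation>Import</Operation>
      <Mode>Import async</Mode>
      <ConfigurationOnly>No</ConfigurationOnly>
      <LegalEntity>USMF</LegalEntity>
    </JobDefinition>
    <!-- Entity setup: behavior applied to every entity in the package -->
    <EntitySetup ID='Generic'>
      <Entity name='*'>
        <SourceDataFormatName>Package</SourceDataFormatName>
        <SelectFields>All fields</SelectFields>
        <SkipStaging>No</SkipStaging>
      </Entity>
    </EntitySetup>
  </SharedSetup>
  <!-- The tasks themselves, referencing the shared setup by ID -->
  <TestGroup name='Vendor configuration'>
    <TestCase Title='Import vendor configuration' ID='100'>
      <DataFile RefID='VendorPackage'/>
      <JobDefinition RefID='ImportJobDefinition'/>
      <EntitySetup RefID='Generic'/>
    </TestCase>
  </TestGroup>
</TestManifest>
```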
A more detailed explanation of the manifest is available here, and there is a TechTalk that walks through the use of data task automation in detail.
I have a package with Vendor group and Payment terms as data entities, and it is uploaded to the LCS project's asset library.

And my manifest looks like this:

I load this manifest into the Data task automation form, and the attributes I provided appear here:

I select the task in this form and click “Run tasks”.

And you can see that a new import data project is created under the Data management workspace, named as a concatenation of the task ID and its title.

When I switch back to the Data task automation screen, you can see that the task is marked “Completed” and “Passed”, and you can verify the details under “Show validation results”.


So to sum it up:
- You can use data task automation to streamline your configuration imports.
- Maintain a manifest that allows you to import into different companies.
- Maintain a single repository in LCS and control the manifest using source control.
In the next and final part of this blog series, let us review how to utilize RSAT and drive delivery excellence in your implementation projects.
Please share any feedback or comments on this page, and if there are other topics you would like me to blog about, I would be more than happy to do so! Happy D365’ing!!