Microsoft Dynamics 365 and Power Platform Library

DynamicsCon Recording - The Power of Dual Write


Excited to share that the recording of my session "The Power of Dual Write" is now available.

Please watch and share your feedback.

Please feel free to ask questions - that way we can all learn more!!





The source of truth for your dual-write questions, issues, and resolutions


If you are stuck on any issue with your dual-write implementation, or are unsure how and where to start with dual-write, the source of truth is a Yammer group run by the Microsoft dual-write product team: https://www.yammer.com/dynamicsaxfeedbackprograms/#/threads/inGroup?type=in_group&feedId=16038053

The FILES tab in the group contains all the A-Z documentation.

Get your questions answered today by various experts, including the MS dual-write product team.

You can even raise a live issue with them in the Yammer group and they connect with you in no time - very useful!!

Error resolution: Copying pre-existing data completed with errors. For additional details, go to initial sync details tab


If you face the error message "Copying pre-existing data completed with errors. For additional details, go to initial sync details tab" while starting a dual-write entity map, check whether you have set up a cross-company data sharing policy for the underlying table of the entity map (which is a data entity in D365 FO).

To resolve this issue, go to System Administration | Setup | Configure cross-company data sharing and look for the policy where you have set up the underlying table of the data entity used in the dual-write entity map. Disable the policy, run the dual-write job again, and wait until it turns to the Running state. Then enable the cross-company data sharing policy again, choosing No in the pop-up window that opens; otherwise it will run against all data across all legal entities, which is not necessary for this exercise since you only disabled the policy to start the dual-write entity map.





What gets changed (technical) in D365 Finance and Operations with dual-write


Being a developer, I have to explore how a new feature or framework has been developed and how it can be extended when needed. While working on a dual-write implementation I came across many technical difficulties because the framework was not mature at the time: dual-write only became generally available at the end of March 2020, and I had been working with it since mid-2019.

I must say MS has done a tremendous amount of work to build the integration between Microsoft Dynamics 365 Finance and Operations apps and Common Data Service via dual-write.

A bunch of AOT objects have been added on the D365 FO side - the screenshot was taken from 10.0.13; these can grow with future releases if needed.

The Application class has also been modified for dual-write; the method below gets called when the database transaction is about to commit, between ttsbegin and ttscommit.

public void ttsNotifyPreCommit()
{
    if (!isInTtsNotifyPreCommit)
    {
        try
        {
            isInTtsNotifyPreCommit = true;
            if (sysTransactionScope)
            {
                sysTransactionScope.onTtsCommitting();
            }

            super();

            if (this.canRaiseEvent())
            {
                this.onTtsNotifyPreCommit();
            }

            // Checks that all suspensions of recId allocation invoked by calling appl.SysRecIdSequence().suspendRecIds() have been removed.
            if (sysRecIdSequence)
            {
                if (!this.sysRecIdSequence().areAllRecIdSuspensionsRemoved())
                {
                    throw error("@SYS344764");
                }
                sysRecIdSequence = null;
            }

            if (this.ttsLevel() == 1)
            {
                DualWriteChangeProcessorBase::processTransactionPrecommit();
            }
        }
        finally
        {
            isInTtsNotifyPreCommit = false;
        }
    }
}


Dual-write triggers only at ttsLevel 1. To understand this concept, I did a small test.
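Here is a minimal sketch of such a test (the class name and test values are hypothetical; VendGroup is a standard table). The inner ttscommit hands nothing to dual-write; only when the outermost transaction commits does ttsNotifyPreCommit see this.ttsLevel() == 1 and call DualWriteChangeProcessorBase::processTransactionPrecommit().

class DualWriteTtsLevelTest
{
    public static void main(Args _args)
    {
        VendGroup vendGroup;

        ttsbegin;   // ttsLevel 1 - outer transaction
        ttsbegin;   // ttsLevel 2 - nested transaction

        vendGroup.VendGroup = 'DW-TEST';              // hypothetical test values
        vendGroup.Name      = 'Dual-write test group';
        vendGroup.insert();

        ttscommit;  // back to ttsLevel 1 - dual-write does not fire here

        info(strFmt("ttsLevel after inner commit: %1", appl.ttsLevel()));

        ttscommit;  // ttsLevel 1 -> 0 - ttsNotifyPreCommit runs here and
                    // hands the change to the dual-write change processor
    }
}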











From the trace log I can see that before ttscommit the execution goes to Application::ttsNotifyPreCommit, and from there it makes a call to DualWriteChangeProcessorBase; in my scenario it then goes to the DualWriteSyncOutbound class.

























If you dig deeper into this class, you will see that it creates the complete JSON message and syncs it to CE.

In my next post, I will show which tables, entities, and classes are involved and how you can see what JSON message is going out from FO to CE. Stay tuned.

Want to see the inbound and outbound payload for dual-write integration?


Let's continue where we left off in the first post, "What gets changed (technical) in D365 Finance and Operations with dual-write", on our journey to understand the technical changes made for the dual-write framework.

In this post we will look at the most heavily used tables, entities, and classes involved in dual-write sync. I will only cover a few tables, classes, and data entities, but the complete list of objects can be explored from the AOT as shown below.











Business scenario: Create or update Vendor Groups in FO and sync over to CDS and vice versa.

Open Vendor groups screen - Accounts Payable > Vendors > Vendor groups







An out-of-the-box dual-write template is available for Vendor groups.





These fields are mapped between Finance and Operations apps and Common Data Service for the Vendor groups dual-write template.

NOTE: Data syncs between the apps only for mapped fields, and the integration only triggers when data changes in these mapped fields on either side of the integration (FO or CDS), provided the sync direction is set to bi-directional (as it is for this scenario).








Requirement: You want to debug the integration (outbound or inbound)

Two important classes:

1. DualWriteSyncInbound

2. DualWriteSyncOutbound

If you want to debug data going out of FO to CDS (outbound), put a breakpoint at method WriteEntityRecordToCDS() of class DualWriteSyncOutbound.

This is quite an extensive method; it also builds the payload going out of FO to CDS, and this payload can be watched and changed using all the debugging features of D365 Finance and Operations.

Method BuildCDSPayload(), called inside method WriteEntityRecordToCDS(), creates the complete payload.

I am pasting this just for reference; it is subject to change at any time, so always refer to the updated code.

internal str BuildCDSPayload(str cdsLookupUrls, common entityRecord, str fieldMappingJson, ICDSSyncProvider syncProvider)
{
    var executionMarkerUniqueIdentifier = newGuid();

    DualWriteSyncLogger::LogExecutionMarker('DualWriteOutbound.BuildCDSPayload', true, strFmt('Execution start for record %1 with uniqueId %2', entityRecord.RecId, executionMarkerUniqueIdentifier));

    CDSPayloadGenerator payloadGenerator = syncProvider.CDSPayloadGenerator;

    responseContract.DualWriteProcessingStage = DualWriteProcessingStage::TransformingSourceData;

    FieldMappingIterator fieldMappingIterator = FieldMappingIterator::ReadJson(fieldMappingJson);

    if (fieldMappingIterator == null)
    {
        responseContract.AddRecordResponse(ExecutionStatus::Failed, strFmt("@DualWriteLabels:InvalidFieldMapping", fieldMappingJson), '');

        DualWriteSyncLogger::LogSyncError('BuildCDSPayload', '', '', strFmt('Failed to create CDS payload. Error reason %1', responseContract.GetFormattedResponseObject()), DualWriteDirection::Outbound);
    }

    while (fieldMappingIterator.MoveNext())
    {
        FieldMapping fieldMapping = fieldMappingIterator.Current();

        var valueTranforms = fieldMapping.ValueTransforms;

        var sourceValue = this.FetchSourceFieldDataFromMapping(entityRecord, fieldMapping);

        // If there is a default value transform then the payload creation is skipped
        // The payload gets added in CDSSyncProvider
        boolean skipPayloadCreation = false;

        if (valueTranforms != null)
        {
            var transformEnum = valueTranforms.GetEnumerator();

            while (transformEnum.MoveNext())
            {
                IValueTransformDetails transform = transformEnum.Current;

                sourceValue = this.ApplyValueTransform(transform, sourceValue);

                skipPayloadCreation = (syncProvider.GetProviderType() != CDSSyncProviderType::CDSQueueSync &&
                    transform.TransformType == ValueTransformType::Default);

                if (transform.HasTransformationFailed)
                {
                    responseContract.AddFailedFieldResponse(fieldMapping.SourceField, strFmt("@DualWriteLabels:FailedTransform", sourceValue, fieldMapping.SourceField, enum2Str(transform.TransformType)), '');
                }
            }
        }

        if (!strContains(fieldMapping.DestinationField, '.') || fieldMapping.IsSystemGenerated)
        {
            if (!skipPayloadCreation)
            {
                payloadGenerator.AddAttributeValuePair(fieldMapping.DestinationField,
                    sourceValue,
                    fieldMapping.IsDestinationFieldQuoted,
                    this.FetchSourceFieldTypeCode(entityRecord, fieldMapping));
            }
        }

        syncProvider.AddSourceColumnTransformedValue(syncProvider.GetMappingKey(fieldMapping.SourceField, fieldMapping.DestinationField), sourceValue);
    }

    responseContract.DualWriteProcessingStage = DualWriteProcessingStage::ResolvingLookups;

    var cdsPayLoad = syncProvider.BuildCDSPayloadForLookups(cdsLookupUrls, fieldMappingJson);

    DualWriteSyncLogger::LogExecutionMarker('DualWriteOutbound.BuildCDSPayload', false, strFmt('Execution ends for record %1 with uniqueId %2', entityRecord.RecId, executionMarkerUniqueIdentifier));

    return cdsPayLoad;
}


To debug data coming into FO from CDS (inbound), put a breakpoint at method WriteDataToEntity() of class DualWriteSyncInbound.

I am pasting this just for reference; it is subject to change at any time, so always refer to the updated code.

private ResponseContract WriteDataToEntity(str entityName, str entityFieldValuesJson, str companyContext, boolean runValidations = false, boolean isDelete = false, DualWriteTransactionId transactionId = '', str CDSSyncVersion = '', boolean isBatchCommit = true)
{
    var executionMarkerUniqueIdentifier = newGuid();

    this.InitializeInboundSync(entityName);

    // ...

    return responseContract;
}


Tips and tricks on how to validate your dual-write integration at field-by-field level - it's easy, believe me 😊


After you are done with your entity field mappings for a required entity map (e.g. Vendor groups, which we started looking at in the previous post to get the payload for the dual-write integration), records get created in the following two tables.

1. DualWriteProjectConfiguration

2. DualWriteProjectFieldConfiguration

Records only get created when you run the entity map, and they get deleted once you stop it.

The DualWriteProjectConfiguration table does not hold data for Vendor groups when the entity map is stopped.







DualWriteProjectFieldConfiguration table does not hold data for Vendor groups when the entity map is stopped.





Run the entity map and check the data in these tables.
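(Tip: you can also browse these tables directly by appending /?mi=SysTableBrowser&TableName=DualWriteProjectConfiguration&cmp=USMF to your environment URL, the same table browser trick that works for any other table.)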



 




1. Both tables link to each other using the Name field

2. For each entity map (e.g. Vendor groups) there are two records: one for insert and update, and the other for delete, whose name ends with _end

Important information about the DualWriteProjectConfiguration table:

1. The External URL field holds the CDS environment's URL, which is linked with FO, in this format: https://<CDS environment URL>/api/data/v9.0/msdyn_vendorgroups. This API URL can be browsed to see the available data in the CDS environment

2. The Project partition map field holds the list of legal entities linked for dual-write; you can copy this list into Notepad++ (or anywhere) and check all legal entities from there too - this may help you in troubleshooting

3. If the IsDebugMod field is marked, failure records are stored in the DualWriteErrorLog table

4. The Filter query expression field stores any Finance and Operations apps filter you have defined for the entity map

Important information about the DualWriteProjectFieldConfiguration table:

Filter records from this table based on the project name you retrieved from the DualWriteProjectConfiguration table, as shown below as an example.

1. The External lookup urls field holds information on how different lookups from CDS tables are linked for this integration.

Tip and trick: Copy data from this field and paste it into Notepad++; the data looks like this







Format the JSON and you can review all the data easily in a JSON viewer.








2. The Field mapping field holds information on how fields are mapped between both apps (Finance and Operations app and Common Data Service). A small job to read this stored mapping is sketched below.

Tip and trick: Copy data from this field and paste it into Notepad++, format the JSON, and it shows like this
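If you would rather pull this JSON in code than through the table browser, a minimal X++ sketch along these lines should work (the class name is hypothetical, and the Name and FieldMapping field names are assumptions based on the observations above):

class DualWriteFieldMappingReader
{
    public static void main(Args _args)
    {
        DualWriteProjectConfiguration      projectConfig;
        DualWriteProjectFieldConfiguration fieldConfig;

        // Both tables link to each other using the Name field (point 1 above);
        // dump the stored field-mapping JSON for every running entity map.
        while select projectConfig
            join fieldConfig
                where fieldConfig.Name == projectConfig.Name
        {
            info(strFmt("Project: %1", projectConfig.Name));
            info(strFmt("Field mapping JSON: %1", fieldConfig.FieldMapping));
        }
    }
}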

















The DualWriteProjectConfigurationEntity data entity is the triggering point to sync data across, based on the pre-defined sync direction discussed above.

In case you want to debug an outbound call for Vendor groups, this will be the stack trace.












I hope this has helped you understand the tables and classes behind dual-write so you can troubleshoot your issues at some point. If you still have any questions, please reach out to me via the comments section.


D365FO: LCS exported bacpac file contains corrupted data


If you experience the below error message when trying to import a bacpac file into a tier 1 (DEV) environment, use the suggested steps to get past this error and get your database imported.

Importing to database 'AxDB_Daxture' on server '.'.

SqlPackage.exe : *** Error importing database:Could not load package from 'C:\Users\Adminb1c06345af\Downloads\daxturebackup.bacpac'.

At line:13 char:5

+     & $LatestSqlPackage $commandParameters "/p:CommandTimeout=0"

+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : NotSpecified: (*** Error impor...backup.bacpac'.:String) [], RemoteException

    + FullyQualifiedErrorId : NativeCommandError

 

File contains corrupted data.


Based on this docs link https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/database/import-database, download the .NET Core version of SqlPackage.exe.

This is a .zip file that can be extracted to C:\Temp\Sqlpackage-dotnetcore.

From there, instead of using the SqlPackage.exe under C:\Program Files (x86), use the SqlPackage.exe in C:\Temp\Sqlpackage-dotnetcore.

The command will be:

C:\Temp\Sqlpackage-dotnetcore>SqlPackage.exe /a:import /sf:D:\Exportedbacpac\my.bacpac /tsn:localhost /tdn:<target database name> /p:CommandTimeout=1200


Dual-write integration for Cross-Company data in D365 Finance and SCM




Dual-write does not work with the cross-company data sharing policies in D365 FinOps (the product has so many names, but I am using this one for reference 😊).

First, a brief overview of the cross-company data sharing policy to set the base for readers: cross-company data sharing lets you have your data accessible from multiple legal entities (companies in D365 FinOps). For example, if you set up a policy for vendors to be cross-company, then whenever you create a new vendor it will be created in all data-sharing legal entities.

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/sysadmin/cross-company-data-sharing

Now, what happens when you have a table under cross-company data sharing (e.g. VendTable) and want to sync vendors through dual-write?

VendTable is one of the data sources for the Vendors data entity (dual-write entity map), and we know dual-write does not work with cross-company data sharing by design. You get the following error message when you try to run the entity map.

"Copying pre-existing data completed with errors.
For additional details, go to initial sync details tab."




The error message is confusing and does not reflect the actual issue behind the scenes - you will never be able to figure out what is wrong until you raise it with the MS dual-write team and share the activity ID of the job with them to investigate the telemetry (you don't have access to check this one 😒); then they share the root cause.

However, you can also investigate by putting a breakpoint in method validateDataSharingEnabledForEntityTableBeforDualWriteEnable() of class SysDataSharingValidation.


/// <summary>
/// Validates that cross company data sharing is not enabled when enabling Dual Write.
/// </summary>
/// <param name = "_entityName">The name of the entity containing the table being enabled</param>
/// <param name = "_tableName">Table in entity</param>
/// <param name = "_dataAreaId">Company info</param>
[SubscribesTo(classStr(BusinessEventsRegistrationBase), staticDelegateStr(BusinessEventsRegistrationBase, onTableEnabled))]
public static void validateDataSharingEnabledForEntityTableBeforDualWriteEnable(str _entityName, str _tableName, DataAreaId _dataAreaId)
{
    SysDataSharingOrganization sysDataSharingOrganizationTable;
    SysDataSharingRuleEnabled  sysDataSharingRuleEnabledTable;

    select firstonly SharedTableName from sysDataSharingRuleEnabledTable
        where sysDataSharingRuleEnabledTable.SharedTableName == _tableName
        join DataSharingPolicy, DataSharingCompany from sysDataSharingOrganizationTable
            where sysDataSharingRuleEnabledTable.DataSharingPolicy == sysDataSharingOrganizationTable.DataSharingPolicy
               && sysDataSharingOrganizationTable.DataSharingCompany == _dataAreaId;

    if (sysDataSharingRuleEnabledTable)
    {
        throw error(strFmt("@DataSharing:CrossCompanySharingError", _entityName, sysDataSharingOrganizationTable.DataSharingPolicy));
    }
}



Dual-write connection set error: An item with the same key has already been added


If you happen to see this error message, then you have duplicate records in the cdm_company entity in the CDS environment. Check the cdm_companycode field; duplicates are normally not allowed there, but have a look and delete the duplicate records.



Steps to follow when refreshing dual-write integrated environments (FO and CE)


Scenario: 

The D365 Finance & Operations app is linked to Common Data Service through dual-write in development or sandbox environments. You are required to refresh the databases in both D365 FO and CDS from another production environment (the dual-write integrated (linked) environments in this scenario).

Steps to follow:

  1. Before the initial refresh, log on to the target environment
  2. Go to Data management > Dual write and stop all jobs
  3. Refresh the FO and CDS environments from the same integrated source environments
  4. Go to Data management > Dual write and unlink the environments
  5. Clean data from the following tables in D365FO if data exists (a cleanup sketch follows this list):
     - DualWriteProjectConfiguration
     - DualWriteProjectFieldConfiguration
  6. Clean data from the DualWriteRunTimeConfiguration table in CDS
  7. Link the environments
  8. Apply the MS dual-write solutions and your custom solution if there is any
  9. Start the jobs; there is no need for an initial sync, as both environments are already in sync because they were copied from the same dual-write integrated source environments
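For step 5, a minimal X++ sketch along these lines can clear the two D365FO tables (the class name is hypothetical; run it only on the freshly refreshed, unlinked environment):

class DualWriteConfigCleanup
{
    public static void main(Args _args)
    {
        DualWriteProjectConfiguration      projectConfig;
        DualWriteProjectFieldConfiguration fieldConfig;

        ttsbegin;
        // Remove any leftover dual-write project configuration copied in with the database
        delete_from projectConfig;
        delete_from fieldConfig;
        ttscommit;

        info("Dual-write configuration tables cleared.");
    }
}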




How to run a dual-write table map when the underlying entity table is set up under a cross-company data sharing policy


Scenario: 

Project groups are shared across all legal entities in the D365 Finance & Operations app, so the ProjGroup table has been set up under one of the cross-company data sharing policies. You are also required to set up the dual-write table map for project groups to sync from FO to CDS. However, you get the following error message when you try to run the table map:

"Copying pre-existing data completed with errors.

For additional details, go to initial sync details tab."

Follow these steps to overcome this issue as a workaround. 

Steps to follow:

  1. Disable the cross-company data sharing policy where the ProjGroup table has been used
  2. Choose Yes in the next pop-up window
  3. Go to Data management > Dual write > select the Project groups table map and Run
  4. Enable the cross-company data sharing policy for project groups again
  5. Choose No in the next pop-up window unless you want to copy the data across all companies

D365FO: Entity cannot be deleted while dependent Entities for a processing group exist. Delete dependent Entities for a processing group and try again.


Scenario:

There are times when you want to delete an entity from the target entity list, and when you do so, you face an error message which does not tell you where exactly the entity has been used. 

"Entity cannot be deleted while dependent Entities for the processing group exist. Delete dependent Entities for a processing group and try again."

Solution:

Browse the environment by appending /?mi=SysTableBrowser&TableName=DMFDefinitionGroupEntity&cmp=USMF to the end of the URL. 

For example, if the environment URL is https://daxture.sandbox.operations.dynamics.com then the complete URL will be https://daxture.sandbox.operations.dynamics.com/?mi=SysTableBrowser&TableName=DMFDefinitionGroupEntity&cmp=USMF

Filter on the Entity column and it will give you the DefinitionGroup where the entity has been added or used in data management import/export projects.

Get the DefinitionGroup name and search the export/import projects under data management, then either delete the whole project or remove the entity from the group. Try deleting/removing the entity from the target entity list again and it should work now.
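If you prefer code to the table browser, a minimal X++ sketch along these lines performs the same lookup (the class name and the entity-name filter value are hypothetical):

class DMFEntityUsageFinder
{
    public static void main(Args _args)
    {
        DMFDefinitionGroupEntity definitionGroupEntity;

        // List every data management project (definition group) that still
        // references the entity you are trying to delete.
        while select DefinitionGroup, Entity from definitionGroupEntity
            where definitionGroupEntity.Entity == 'Customers V3'   // hypothetical entity name
        {
            info(strFmt("Entity %1 is used in project %2",
                        definitionGroupEntity.Entity,
                        definitionGroupEntity.DefinitionGroup));
        }
    }
}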







Another step closer - Finance Operations data in Power Platform - Virtual Entities


This post focuses on the integration technologies available to make Microsoft Dynamics 365 Finance and Operations data available in Dataverse/Common Data Service (CDS). What could be better than having the biggest ERP system's data in Power Platform? You can build Power Portals, Power Apps, and Power BI analytical reports, use Power Virtual Agents for inventory closing and year-end closing processes, and manage expenses and employee/contractor time entry processes. Most of these processes can run without even logging in to the MS ERP (Dynamics 365 Finance and Operations), so you can save on license cost too. 

Let's see what options are available to integrate F&O data with Power Platform; this post, however, is dedicated to virtual entities. 

3 options are available out-of-the-box to integrate F&O data with Power Platform:

👉 Data Integrator - click on the link to read more

👉 Dual-write - click on the link to read more

👉 Virtual entities - MS Tech Talk on virtual entities 




Before we jump to the installation and configuration part, let's see when virtual entities became generally available and what features they offer compared to the other two integration technologies.

Virtual Entities Generally available

✔️ Finance and Supply Chain Management App: 10.0.12 or later

✔️ Dataverse: Service update 189 or later

Virtual Entities features

✔️ Finance and Operations is available as a data source in Dataverse

✔️ No Finance and Operations data replication in Dataverse

✔️ Access all public data entities of Finance and Operations in Dataverse

✔️ Supports all CRUD operations

Install Virtual Entities solution

Head to this link https://appsource.microsoft.com/en-us/product/dynamics-365/mscrm.finance_and_operations_virtual_entity and click Get it now










Enter your work or school account and Sign in

Choose the environment where you want to install this solution













Wait for the installation to finish









Finance and Operations Virtual Entity solution shows as Enabled









The Finance and Operations Virtual Entity solution is installed successfully - hurray!! That was easy.








Register an App in Azure Active Directory

The AAD application must be created on the same tenant as F&O.

  1. Log on to https://portal.azure.com
  2. Go to Azure Active Directory > App registrations
  3. Select New registration and define these attributes:
     - Name
     - Account type
     - Redirect URI (leave blank)
  4. Select Register
  5. Make a note of the Application (client) ID; you will need it later


Register an App



Create a symmetric key for the application, save it, and note it for later use.


Steps to follow in Dataverse environment 

Log on to the Dataverse environment and click on Advanced settings


Go to Administrator













Choose Virtual entity data sources


Finance and Operations is available as one of the data sources in Dataverse




















Click on Finance and Operations and the following screen pops up; this is where the connection is established


























Configuration in Finance and Operations

  1. Log on to Finance and Operations and go to System Administration | Users | Users
  2. Create a new user and assign the 'CDS virtual entity application' role to it - don't assign the system admin role to this user. This user is used to look at the metadata of the data entities from the Dataverse instance.
  3. Enter the Application ID in the System Administration | Setup | Azure Active Directory applications screen with User ID = <the user created in step 2>


Test Finance and Operations data in Dataverse

Log on to the Dataverse instance, click on the little funnel to open Advanced Find, and look for 'Available Finance and Operations Entities' in the list of tables in the Dataverse instance. 


















By default, not all entities are enabled; this is to avoid cluttering the user experience in Dataverse. Individual entities can be enabled, though - e.g. I enabled DataManagementDefinitionGroupEntity and marked it visible to make it a virtual entity in Dataverse.















To illustrate this example, I created an export data project in Finance and Operations under Data management with the name 'CDSVirtualEntitiesExport'. The data entity behind this data export project is DataManagementDefinitionGroupEntity, which was marked as a virtual entity in the step above. 
















Restart Advanced Find in the Dataverse instance, look for the Definition Group (mserp) table, and run the query to see the output












This is it for today; in the next post I will explain how to do customization/extension in F&O and get the data into Dataverse using virtual entities. 

Hope you enjoyed the post; please do provide your feedback. Enjoy your break!!


D365FO: Right-click on any control in the D365FO browser to go directly to the control in the AOT


Last week I explored a very interesting feature, especially for developers, where you right-click on any field/control on a form and follow these steps.












This opens Visual Studio in non-admin mode, opens the correct form, and takes you directly to the control in the AOT.

NOTE: You can only get this feature within a development VM, where your browser and Visual Studio are on the same machine. I am on 10.0.14 but not sure when this great feature first became available :(



Dual-write learning series - Dual-write initial sync is the data integrator


One of the features of dual-write is the initial sync, where you copy data from the source app (Finance and Operations OR Dataverse) to the target app, depending on the selection in the Master for initial sync option. 

This initial sync is the Data Integrator service running behind the scenes, which copies your data over. You configure the application ID for the Data Integrator and add it to both apps; I documented this in my previous post, The Dual Write implementation - Part 2 - understand and action pre-requisites.

Master for initial sync can be either Common Data Service (Dataverse) or Finance and Operations apps. For example, if I choose the Finance and Operations app in the example below, where I am syncing Functional locations, then all records will be copied from Finance and Operations to Dataverse.








Initial sync is a full push, meaning that if an individual row fails to sync, you cannot resync only the failed rows. If the initial synchronization only partially succeeds, a second synchronization runs for all the rows, not just the rows that failed during the initial synchronization.

For example:

The 1st initial sync runs for 1000 records from FO to CDS → 700 pass and 300 fail

The 2nd initial sync run will again run for all 1000 records 

Do check Considerations for initial sync on Microsoft Docs. 

Initial sync runs against all legal entities configured for dual-write. If you have entered a filter for a specific legal entity in a table map on the Finance and Operations app side, as shown below as an example, this will not apply to the initial sync, as it runs against all legal entities configured for dual-write under environment details.




MS D365 FinOps database sync failed with an error in SECURITYROLEPRIVILEGERESOURCELICENSEMAP


Scenario:

Database synchronization failed with the following error message. 

It happened after I refreshed the tier 1 (DEV) environment of MS D365 FinOps from a sandbox environment database. The original database was renamed to AxDB_Orig as part of this refresh process.

Severity Code Description Project File Line Suppression State
Error ON JMAP.PRIVILEGEIDENTIFIER = SP.IDENTIFIER AND JMAP.ISUNIQUE = 1 0
Error LEFT JOIN SECURITYMENUITEMLICENSES RL 0
Error at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async, Int32 timeout, Boolean asyncWrite) 0
Error INSERT INTO SECURITYROLEPRIVILEGERESOURCELICENSEMAP 0
Error at Microsoft.Dynamics.AX.Framework.Database.Tools.SyncEngine.Run(String metadataDirectory, String sqlConnectionString, SyncOptions options) 0
Error SP.RECID, 0
Error RP.AOTNAME, 0
Error at Microsoft.Dynamics.AX.Framework.Database.Synchronize.InitialSchemaSync.ScriptRegion.ExecuteCommand(SqlCommand cmd) 0
Error ON RL.IDENTIFIER = RP.AOTNAME 0
Error at Microsoft.Dynamics.AX.Framework.Database.Tools.SyncEngine.RunSync() 0
Error LEFT JOIN SECURITYROLEPRIVILEGEEXPLODEDGRAPH PRMAP 0
Error END, 0
Error \n\nException message: System.Data.SqlClient.SqlException (0x80131904): Column name or number of supplied values does not match table definition. 0
Error FROM SECURITYPRIVILEGE SP 0
Error -- MAINTAINLICENSE 4 is Operations. We update MAINTAINLICENSE only if existing MAINTAINLICENSE is operations and set to Finance (8), SCM (9), Retail (10) 0
Error [MAINTAINLICENSE] = 0
Error ON PRMAP.SECURITYPRIVILEGE = SP.RECID 0
Error SP.IDENTIFIER, 0
Error --cleanup before populating data 0
Error \nCREATE PROCEDURE [DBO].[LICENSING_POPULATEROLEPRIVILEGELICENSEMAP] 0
Error --- End of inner exception stack trace --- 0
Error AS 0
Error at Microsoft.Dynamics.AX.Framework.Database.Tools.SyncEngine.InitialSchemaSync() 0
Error at Microsoft.Dynamics.AX.Framework.Database.Tools.StaticSchema.RunStaticUpdate(String sqlConnectionString, String binDir, Boolean skipRegionHashing) 0
Error at Microsoft.Dynamics.AX.Framework.Database.Tools.SyncEngine.FullSync() 0
Error CASE WHEN JMAP.SKUNAME = 'Finance' AND RL.MAINTAINLICENSE = 4 THEN 8 0
Error TRUNCATE TABLE SECURITYROLEPRIVILEGERESOURCELICENSEMAP 0
Error LEFT JOIN LICENSINGSERVICEPLANSPRIVILEGE JMAP 0
Error RL.VIEWLICENSE 0
Error at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady) 0
Error END; 0
Error BEGIN 0
Error at Microsoft.Dynamics.AX.Framework.Database.Synchronize.InitialSchemaSync.RunSync() 0
Error SELECT DISTINCT PRMAP.SECURITYROLE, 0
Error at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) 0
Error at Microsoft.Dynamics.AX.Framework.Database.Synchronize.InitialSchemaSync.ScriptRegion.Execute(SqlConnection connection, SqlTransaction transaction) 0
Error RL.MENUITEMTYPE, 0
Error Initialize schema failed. Microsoft.Dynamics.AX.Framework.Database.TableSyncException: Failed during InitialSchema at command: 0
Error ClientConnectionId:692f02f7-8690-4127-9da2-c59ead0da371 0
Error ON SP.IDENTIFIER = RP.PRIVILEGEIDENTIFIER AND SECURABLETYPE IN (1,2,3) 0
Error WHEN JMAP.SKUNAME = 'SCM' AND RL.MAINTAINLICENSE = 4 THEN 9 0
Error WHEN JMAP.SKUNAME = 'Retail' AND RL.MAINTAINLICENSE = 4 THEN 10 0
Error ELSE RL.MAINTAINLICENSE 0
Error at System.Data.SqlClient.SqlCommand.ExecuteNonQuery() 0
Error -- While updating we also consider Privilege in LicensePrivileges.json is unique. 0
Error at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) 0
Error WHERE RL.MENUITEMTYPE IS NOT NULL 0
Error Error Number:213,State:1,Class:16 ---> System.Data.SqlClient.SqlException: Column name or number of supplied values does not match table definition. 0
Error JOIN SECURITYRESOURCEPRIVILEGEPERMISSIONS RP 0
Error at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry) 0
Error syncengine.exe exited with code -1. 1   

Resolution:

It was a column-count mismatch between the source and target databases in SECURITYROLEPRIVILEGERESOURCELICENSEMAP.

I executed this command against both the AxDB and AxDB_Orig databases:

sp_columns SECURITYROLEPRIVILEGERESOURCELICENSEMAP

I compared the number of columns between both DBs and found that AxDB had an extra column, added as part of the refresh from the sandbox database.

I opened Object Explorer and removed the additional column from SECURITYROLEPRIVILEGERESOURCELICENSEMAP.

Then I synchronized the database in MS D365 FinOps and it completed successfully.

MS D365 FinOps: How to create a new LCS project and deploy a Tier 1 (DEV) VM - even if you are not an MS customer or partner :) - Part I


Scenario:

You are willing to work on the Microsoft Dynamics 365 Finance and Operations product and want to get your hands dirty with some development. You have heard so much about this product but never got a chance to work on this MS ERP. 

Andre wrote a detailed post on how you can set up a trial environment for MS D365 Finance and Operations.

Solution:

In this post I will explain, step by step, how:

  1. to create your own LCS project
  2. to deploy a new Tier 1 VM
  3. to log on to the Azure portal
  4. to access Power Platform environments
  5. to create your own Power App and use other features
  6. to deploy solutions in Power Platform to integrate with Finance and Operations

NOTE: You will need an Azure subscription to deploy the VM.

First, create a new domain to perform all the above steps - it's easy, just follow these steps.

1. Open the Office 365 E3 site in incognito mode or as a guest and go with the Free Trial option









2. Fill in details - you can use your personal email or sign up for a new email account and use that one












3. Provide as much information as you can - it will be good for you :)














4. Choose a verification method; I always select Text me 















5. Once verified, enter your business details and check the availability
























6. Sign up and you are ready to use this account to perform all the steps mentioned under the solution section











The Manage your subscription option will take you to the Microsoft 365 admin center, where you will have 25 free user licenses for a whole month. You can use this account to sign up for Teams and enjoy all features FREE for the whole month!!










Log on to LCS (lcs.dynamics.com) using the account created above.

Create a new project by clicking on the + sign and fill in the information - the product name should be Finance and Operations.










Your LCS project is ready; click on the hamburger sign and go to Project settings.













Under Organization and ownership, the type will be either customer or partner, based on the account you used to log on to LCS. If your account is linked to a partner organization then this will be partner, and it will be customer if your account is of type customer.

Remember, the account created in this post is linked to neither a partner nor a customer, so we cannot deploy any tier 1 environment in LCS; in order to connect LCS to the Azure portal, the company account must be either customer or partner. 

So, we are blocked here :(

Here is the trick to convert this prospect account to a customer account and unblock ourselves. 

Browse https://trials.dynamics.com/ and choose Finance and Operations, enter your new account, and hit Get Started. This will deploy a new trial environment with demo data in the next 30 minutes. Read Andre's post to find out the downsides of this environment.















After the trial environment is deployed, refresh the project settings page (you can sign out and sign in again) to see that the type has changed from prospect to customer.











Now you are a customer, so let's continue our journey of completing the solution; but this is it for this post, and we will continue deploying a cloud-hosted environment through LCS in the Azure portal in the next post.

D365FO - How to read metadata information from AOT through X++

The following code snippet loops through the metadata of AOT objects (this example loops through all data entities).


class DataEntityExtension
{
    /// <summary>
    /// Runs the class with the specified arguments.
    /// </summary>
    /// <param name = "_args">The specified arguments.</param> 
    public static void main(Args _args)
    {
        DMFEntityTableExtension entityTableExtension;

        var entityNames = Microsoft.Dynamics.Ax.Xpp.MetadataSupport::GetDataEntityViewNames();
        var enumerator  = entityNames.getEnumerator();

        while (enumerator.MoveNext())
        {
            Microsoft.Dynamics.AX.Metadata.MetaModel.AxDataEntityView axDataEntity = Microsoft.Dynamics.Ax.Xpp.MetadataSupport::GetDataEntityView(enumerator.Current);

            if (axDataEntity)
            {
                // Clear the buffer so each entity is inserted as a fresh record
                entityTableExtension.clear();
                entityTableExtension.Label                = SysLabel::labelId2String(axDataEntity.Label);
                entityTableExtension.PublicEntityName     = axDataEntity.PublicEntityName;
                entityTableExtension.PublicCollectionName = axDataEntity.PublicCollectionName;
                entityTableExtension.EntityName           = axDataEntity.Name;
                entityTableExtension.IsPublic             = axDataEntity.IsPublic;
                entityTableExtension.insert();

                info(strFmt("Entity name: %1 -- Entity label: %2 -- Entity public collection name: %3 -- Entity public name: %4 -- Is entity public: %5",
                            axDataEntity.Name,
                            SysLabel::labelId2String(axDataEntity.Label),
                            axDataEntity.PublicCollectionName,
                            axDataEntity.PublicEntityName,
                            axDataEntity.IsPublic));
            }
        }
    }
}

OUTPUT:





Get your Dynamics 365 FO tier 2 (sandbox) environment today!!


Pakistan User Group is hosting a FREE training program for everyone, covering the Microsoft Business Applications and Azure components of the Microsoft ecosystem from beginner to advanced level. Register now if you have not yet, and join us on Saturday 20th November at 4pm Pakistan Standard Time (GMT+5).

All details apart, this post is a quick guide to getting your own Microsoft Dynamics 365 Finance and Operations tier 2 environment FREE!! I will create step-by-step videos to explain all these steps in detail; I know they require detailed explanation. Subscribe: https://www.youtube.com/c/DaxtureD365FO 

Let's begin...

Open this URL https://dynamics.microsoft.com/en-au/intelligent-order-management/overview/?ef_id=e0b92d13d85e177270894c83385bd79c:G:s&OCID=AID2200017_SEM_e0b92d13d85e177270894c83385bd79c:G:s&msclkid=e0b92d13d85e177270894c83385bd79c and click on Request a demo and sign up now.

















Enter a work or school email address (create a new one if you don't have one - this can be a Gmail or Hotmail account, so don't worry too much, but it must be your own valid email, as you will receive an email confirmation on this account). Upon entering your email, it will ask you to set up a new account. 

Complete all steps and verify your account either via email or SMS.















Get Started 

Choose region on next screen and Submit











Log on to Lifecycle Services https://lcs.dynamics.com with the account you created above (e.g. I created the account 1DynamicsDaxture@1dynamics675.onmicrosoft.com). The first time, you will see the following screen. 












Click on the + sign to create a new project (select the product of your choice - I have chosen Finance and Operations) 








The project is created; click on Project onboarding and follow the documentation to complete project onboarding. This is a required step before environments can be deployed. Comment if you want to discuss this process with me.
















Upon project onboarding completion, the Configure button will be enabled (note: for this example, I have not completed the onboarding process, hence the Configure button is disabled).

Configure the new environment following the MS Docs article - any questions, again, ping me directly. This will take less than an hour to deploy a new sandbox (tier 2) environment for Finance and Operations. This also creates a new environment in Power Platform; check it from this URL: https://make.powerapps.com/environments

Sign up for FREE on portal.azure.com for 1 month using the same account :)

I know some steps require more explanation; I will create short videos on all the steps and share them. Stay tuned!! 


Download a large bacpac (sandbox database) to a DEV environment much faster


As the LCS website gets slower and slower and the database backups get bigger and bigger, use AzCopy to download objects out of the LCS asset library. It is incredibly quick compared to manually downloading the files (under a minute for a gigabyte vs. an hour+).








Download AzCopy to the environment (https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?toc=/azure/storage/files/toc.json#download-azcopy) and use the PowerShell command: .\azcopy copy "<LCS SAS link>" "<local path>"

The only issue I noticed is that the local path had to be a folder, not the root of the drive (so "C:\Temp", not "C:\"), which is more related to Windows security than anything else.

Below is the example:

Extract the AzCopy zip folder to the C:\Temp folder.

It took 3 minutes to download an almost 18 GB data file - WOW feeling :)

