
Writeback – how low can you go?

I recently made a post about writing back to SQL from Power Apps using stored procedures, and one of the reasons for doing so was performance. But what is there to say about performance if no actual comparisons are made, other than gut feeling? The purpose of this post is to compare four different ways of generating new rows in a database. Three of them can also be used with SharePoint and Dataverse, while the fourth, the stored procedure approach, is exclusive to SQL.

Setting up the data

Generally, we’ll be working with 100 records at a time with the following structure:

SELECT  [rowid]
      ,[Customer_Key]
      ,[Q1Amt]
      ,[Q2Amt]
      ,[Q3Amt]
      ,[Q4Amt]
      ,[BudgetID]
      ,[CreatedBy]
  FROM [CustomerBudgetApp].[BudgetAmounts]
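
For reference, below is a minimal sketch of what the underlying table could look like. The data types, identity column and nullability are my own assumptions for illustration, not taken from the actual database.

-- Hypothetical definition of the target table; data types, identity column
-- and nullability are assumptions for illustration only.
CREATE TABLE [CustomerBudgetApp].[BudgetAmounts]
(
    [rowid]        INT IDENTITY(1,1) PRIMARY KEY,
    [Customer_Key] INT            NOT NULL,
    [Q1Amt]        DECIMAL(18, 2) NULL,
    [Q2Amt]        DECIMAL(18, 2) NULL,
    [Q3Amt]        DECIMAL(18, 2) NULL,
    [Q4Amt]        DECIMAL(18, 2) NULL,
    [BudgetID]     INT            NOT NULL,
    [CreatedBy]    NVARCHAR(100)  NULL
);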

To generate the data points, we use a ForAll statement iterating over a Sequence. It starts with a Clear statement to empty the collection. If we applied the usual suspect, ClearCollect, inside the loop, we would end up with a single row, as the collection would be emptied for every record in the Sequence. A classic blunder.

Clear(col_Budgets);
ForAll(Sequence(100, 1, 1),
    Collect(col_Budgets,
        {BudgetID: 1, CreatedBy: "Skills", Customer_Key: ThisRecord.Value, Q1Amt: RandBetween(1000, 20000), Q2Amt: RandBetween(1000, 20000), Q3Amt: RandBetween(1000, 20000), Q4Amt: RandBetween(1000, 20000)}
    )
)

ForAll with Patch

The first Patch method we will apply is a ForAll statement with a Patch statement inside. When reading the code, you might notice that it actually supports both inserts and updates, but for this post we will focus only on the insert. The snippet starts by initializing a Timer, so we can get an idea of the performance, and it pauses the Timer before forcing a refresh of our database (which generally is not necessary). That means we are timing from the moment Power Apps starts patching to the moment it is done patching; the cleanup we do afterwards is ignored.

Reset(Timer1);
Set(varStart, true);

ForAll(col_Budgets As CurRec,
    Patch('CustomerBudgetApp.BudgetAmounts',
        Coalesce(
            LookUp('CustomerBudgetApp.BudgetAmounts', Customer_Key = CurRec.Customer_Key && BudgetID = CurRec.BudgetID),
            Defaults('CustomerBudgetApp.BudgetAmounts')
        ),
        {BudgetID: CurRec.BudgetID, Q1Amt: CurRec.Q1Amt, Q2Amt: CurRec.Q2Amt, Q3Amt: CurRec.Q3Amt, Q4Amt: CurRec.Q4Amt, Customer_Key: CurRec.Customer_Key, CreatedBy: "Skills"}
    )
);
Set(varStart,false);
Refresh('CustomerBudgetApp.BudgetAmounts');ClearCollect(col_Budgets, Filter('CustomerBudgetApp.BudgetAmounts', BudgetID = v_budgetkey));

The first attempt with this approach yields a time of 35,154, or 35 seconds and 154 milliseconds. That is a pretty terrible time for patching just 100 records, even if it includes the potential upserting.

Patch with ForAll

The next approach swaps the positions of Patch and ForAll: instead of nesting Patch, we now nest ForAll inside the Patch statement. This should improve our performance by an incredible amount. The drawback, if any, is that this approach does not support upserts, only inserts.

Reset(Timer1);
Set(varStart, true);
Patch('CustomerBudgetApp.BudgetAmounts', ForAll(col_Budgets As CurRec, {BudgetID:CurRec.BudgetID, Q1Amt: CurRec.Q1Amt, Q2Amt: CurRec.Q2Amt, Q3Amt: CurRec.Q3Amt, Q4Amt: CurRec.Q4Amt, Customer_Key: CurRec.Customer_Key, CreatedBy: "Skills"}));
Set(varStart, false);
Refresh('CustomerBudgetApp.BudgetAmounts');ClearCollect(col_Budgets, Filter('CustomerBudgetApp.BudgetAmounts', BudgetID = v_budgetkey));

The first attempt with this approach yields a time of 2,159, or 2 seconds and 159 milliseconds. That is a much better time, and thereby a much better experience for the users of the application; with this approach, they are far less likely to close the application out of boredom while waiting for the process to complete.

Patching an entire collection

The third and final Patch approach is patching an entire collection. The expected performance is even better than the two previous methods, while also supporting upserts. So basically, if you are able to use this approach over the two previous ones, you should most definitely go this route. That is the expectation; let us see what the results say.

Reset(Timer1);
Set(varStart, true);
Patch('CustomerBudgetApp.BudgetAmounts', col_Budgets);
Set(varStart, false);
Refresh('CustomerBudgetApp.BudgetAmounts');ClearCollect(col_Budgets, Filter('CustomerBudgetApp.BudgetAmounts', BudgetID = v_budgetkey));

The first attempt with this approach yields a time of 2,107, or 2 seconds and 107 milliseconds. That is only a minor improvement, so performance alone is no real reason to prioritize this approach over nesting ForAll inside Patch. The fact that it also supports upserts, however, is a very logical reason to consider this approach instead.

The mythical stored procedure

The final approach uses a stored procedure, as covered in a previous post.

Reset(Timer1);
Set(varStart, true);
skillsthrills.CustomerBudgetAppMergeCustomerBudget({Json: JSON(col_Budgets, JSONFormat.IndentFour), CreatedBy: "Skills"});
Set(varStart, false);
Refresh('CustomerBudgetApp.BudgetAmounts');ClearCollect(col_Budgets, Filter('CustomerBudgetApp.BudgetAmounts', BudgetID = v_budgetkey));
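
The procedure itself is covered in the previous post, but as a rough idea of what it does, here is a minimal sketch of a merge procedure driven by the JSON payload coming from Power Apps. The object name, parameter types, JSON property mapping and MERGE key are assumptions for illustration; the actual implementation may differ.

-- Hypothetical sketch of a merge procedure fed by the JSON payload from Power Apps.
-- Object name, parameter types and the MERGE key are assumptions for illustration.
CREATE PROCEDURE [CustomerBudgetApp].[MergeCustomerBudget]
    @Json      NVARCHAR(MAX),
    @CreatedBy NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    MERGE [CustomerBudgetApp].[BudgetAmounts] AS tgt
    USING (
        -- Shred the JSON array produced by JSON(col_Budgets, ...) into rows
        SELECT Customer_Key, BudgetID, Q1Amt, Q2Amt, Q3Amt, Q4Amt
        FROM OPENJSON(@Json)
        WITH (
            Customer_Key INT            '$.Customer_Key',
            BudgetID     INT            '$.BudgetID',
            Q1Amt        DECIMAL(18, 2) '$.Q1Amt',
            Q2Amt        DECIMAL(18, 2) '$.Q2Amt',
            Q3Amt        DECIMAL(18, 2) '$.Q3Amt',
            Q4Amt        DECIMAL(18, 2) '$.Q4Amt'
        )
    ) AS src
        ON  tgt.Customer_Key = src.Customer_Key
        AND tgt.BudgetID     = src.BudgetID
    WHEN MATCHED THEN
        -- Existing row for this customer and budget: update the amounts
        UPDATE SET Q1Amt = src.Q1Amt, Q2Amt = src.Q2Amt, Q3Amt = src.Q3Amt, Q4Amt = src.Q4Amt
    WHEN NOT MATCHED THEN
        -- No existing row: insert a new one
        INSERT (Customer_Key, BudgetID, Q1Amt, Q2Amt, Q3Amt, Q4Amt, CreatedBy)
        VALUES (src.Customer_Key, src.BudgetID, src.Q1Amt, src.Q2Amt, src.Q3Amt, src.Q4Amt, @CreatedBy);
END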

The first attempt with this approach yields a time of 302, or simply 302 milliseconds. The performance increase is quite noticeable again, and upserts are supported as well. The drawback of this approach is that it only works with SQL, and of all the approaches above it is the only one that is completely dependent on refreshing the data source after the merge has completed. That operation also takes a bit of time.

Closing thoughts

Generally, the differences in performance seem to grow pretty fast. With a row count of 1,000 instead of 100, we would probably not want to run five attempts of the ForAll-with-Patch approach. The others would still perform fairly well, and the stored procedure would likely show no noticeable increase in time at all. Overall I would recommend patching entire collections, as this works across sources, including Dataverse, SharePoint and SQL, it supports upserts, and the performance is generally acceptable.
If only a single row has changed between our collection and the data source, the patch-collection approach will even outperform the stored procedure. And if we are only creating records, and thereby not looking at updates at all, nesting ForAll inside Patch is a very interesting approach that I believe few people actually know about.

Attempt No. | ForAll Patch | Patch ForAll | Patch Col | Stored Proc.
1           |       35,154 |        2,159 |     2,107 |          302
2           |       50,954 |        2,059 |     4,804 |          205
3           |       23,703 |        2,731 |       969 |          164
4           |       60,000 |        1,480 |     1,324 |          153
5           |       38,210 |        1,598 |     1,948 |          132
Performance comparison, times in milliseconds. *Attempt 4 for ForAll Patch reached my maximum timer value of 60,000; it actually kept going for a while.
