Database Inserts - Transaction Data Entity comparison
In this article, we'll walk through a series of performance-related tests I ran against the data entity SalesOrderHeadersV2. In other articles, we ran the same tests against a custom entity for a code table as well as an OOTB data entity, Customers V2. Each has a different usage profile because the underlying table or tables are very different: they have different numbers of columns, indexes, extensions, event handlers, and delegates. From those articles, we can see that the more work we ask X++ to do, the slower it gets, which makes sense. However, how much of that slowdown comes from the data entity itself and its data model, and how much is simply the interaction with the underlying database? Let's see if we can answer that question, and also look at which pattern gives us the best performance when working with a data entity and treating it just like a table buffer.
Single Insert
The explanation for this test can be found here. We're looking at 4 different tests with insert loop sizes of 1, 10, 250, and 1000. We can see the "sweet spot" with this pattern is around 250 records, and our dev box ("Dev", Azure-hosted) performs much better than our LCS-hosted Tier 2 sandbox (TST). Overall, performance is quite poor for this type of pattern.
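As a rough sketch of what this pattern looks like in X++ (not the actual test harness), it boils down to one insert() call per record. The entity's AOT name (SalesOrderHeaderV2Entity) and the fields being populated here are assumptions for illustration only.

// Sketch of the "single insert" pattern: one insert() call per record.
// Entity and field names are illustrative, not the exact test code.
public static void singleInsertTest(int _records)
{
    SalesOrderHeaderV2Entity entity;

    for (int i = 1; i <= _records; i++)
    {
        entity.clear();
        entity.SalesOrderNumber = strFmt('PERF-%1', i);      // assumed field
        entity.OrderingCustomerAccountNumber = 'US-001';      // assumed field
        entity.insert();   // one round trip plus entity insert logic per record
    }
}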
MultiInsert
The explanation for this test can be found here. Same test setup as before, but we can see a slight improvement in performance across all workload sizes. We won't see huge improvements compared to other tests or workloads because this type of table will almost always suffer from a "downgrade" - a conversion from a set-based operation to a row-based operation. More at https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/dev...
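My reading of the MultiInsert pattern is that it keeps the row-by-row insert() calls but wraps them in a single ttsbegin/ttscommit scope so everything commits as one transaction. Treat the sketch below as an illustration under that assumption, with the same placeholder entity and field names as above.

// Sketch of the "multi insert" variant as I understand it: the same per-row
// inserts, wrapped in one transaction scope. Field names are illustrative.
public static void multiInsertTest(int _records)
{
    SalesOrderHeaderV2Entity entity;

    ttsbegin;
    for (int i = 1; i <= _records; i++)
    {
        entity.clear();
        entity.SalesOrderNumber = strFmt('PERF-%1', i);
        entity.insert();
    }
    ttscommit;   // all rows commit together
}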
InsertRecordsetInsert
The explanation for this test can be found here. I couldn't get this to work with a data entity. I believe I received an error that the view was read-only.
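For context, the set-based construct being attempted looks roughly like the sketch below. StagingSourceTable and the field mapping are hypothetical stand-ins for whatever source you copy from, and against this entity the statement failed with the read-only view error.

// Rough shape of the insert_recordset attempt against the data entity.
// StagingSourceTable and the mapped fields are hypothetical.
SalesOrderHeaderV2Entity entity;
StagingSourceTable       source;

insert_recordset entity (SalesOrderNumber, OrderingCustomerAccountNumber)
    select SalesOrderNumber, OrderingCustomerAccountNumber
    from source;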
RecordInsertListInsert
The explanation for this test can be found here. Same basic setup as before, but we're seeing very different results. When using a RecordInsertList, we get much better performance for small batches of 10 records or fewer. However, somewhere between 10 and 250 records, performance degrades quite a bit. This is a guess, but it seems likely there is some limit to how much of the transaction SQL can hold in memory, and at some point it has to perform the entire operation against much slower disk storage. Regardless, small-batch RecordInsertList usage seems like a good idea for performance.
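A minimal sketch of the RecordInsertList pattern, using the same illustrative entity and field names as above: records are buffered in memory and flushed to SQL in one call.

// Sketch of the RecordInsertList pattern. Entity/field names are illustrative.
public static void recordInsertListTest(int _records)
{
    SalesOrderHeaderV2Entity entity;
    RecordInsertList ril = new RecordInsertList(tableNum(SalesOrderHeaderV2Entity));

    for (int i = 1; i <= _records; i++)
    {
        entity.clear();
        entity.SalesOrderNumber = strFmt('PERF-%1', i);
        ril.add(entity);          // queued in memory, not yet in SQL
    }

    ril.insertDatabase();         // records are pushed to SQL here
}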
RecordSortedInsertList
The explanation for this test can be found here. Same setup as the other tests, but this one is only marginally faster than the Single Insert and MultiInsert tests. It doesn't show the same behavior as the RecordInsertList despite being a similar kind of construct.
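Assuming this test maps to the RecordSortedList class, a minimal sketch looks like the following. The class requires a sort order before records can be added; the sort field and values are placeholders.

// Sketch of the RecordSortedList pattern (my assumed mapping for this test).
// Entity/field names are illustrative.
public static void recordSortedListTest(int _records)
{
    SalesOrderHeaderV2Entity entity;
    RecordSortedList rsl = new RecordSortedList(tableNum(SalesOrderHeaderV2Entity));

    rsl.sortOrder(fieldNum(SalesOrderHeaderV2Entity, SalesOrderNumber));

    for (int i = 1; i <= _records; i++)
    {
        entity.clear();
        entity.SalesOrderNumber = strFmt('PERF-%1', i);
        rsl.ins(entity);          // add to the in-memory sorted list
    }

    rsl.insertDatabase();         // flush the whole list to SQL
}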
SysDAInsertObject
The explanation for this test can be found here. I also couldn't get this to work with a data entity.
QueryInsertInsertRecordsetInsert
The explanation for this test can be found here. I also couldn't get this to work with a data entity.
Conclusion
Overall, using set-based operators still produced better performance results than any other construct, and this is true for sets of any size. Also, as a theme throughout the entire set of tests, the "single insert" pattern performed the worst. The chart below shows the "winner" and how much faster it can be under specific circumstances.