Using ATL to Create Data to Model Data Volume Testing
What can we do with ATL to create some transactions? Let's take a look.
After this post, I wanted to manufacture some data to better show that adding computed fields adds overhead to every operation within a data entity. That post showed it to a degree, but I wanted something more cut and dried. I thought a good way to manufacture data in a Contoso environment would be to use ATL, stringing together ATL processes to create, pack, and invoice orders. In designing a way to do this, I had to fumble my way through it. I wanted to add approximately 10 million processed orders. It didn't exactly work out, but here is what I learned.
Legal Entities
Rather than specifically providing values for your known legal entities, you can create Demo Data Value Providers for them. These allow ATL to resolve values per legal entity without your test having to know what they are. You will have to source these values, but once sourced, you can run the same tests in different legal entities by using a different Demo Data Value Provider in each legal entity. Below is an example and it is, sadly, not an HDMI cable:
class DemoDataValuesProviderUSMFInvent implements IDemoDataValuesProviderInvent
{
    // The item number ATL should use as the default item in USMF.
    public ItemId itemDefault()
    {
        return '1000';
    }
}
Very simple but powerful. When any ATL test runs in legal entity USMF and ATL asks for a default item number, it gets '1000'. Each provider has the legal entity code in its name, so you can create these providers by legal entity code (DataAreaId) as needed and build one set of tests that runs across multiple legal entities. You can also use DemoDataValuesForUSMF to provide other legal-entity-specific demo values that may not have a specific method available for them. Additionally, look at the class attribute SysTestCaseDataDependency. A few classes to check out are CostingSampleTest, InventCountingJournalSampleTest, as well as any DemoDataValuesProvider* class.
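To make that concrete, here is a minimal sketch of a test that leans on the provider. It assumes the usual SysTest conventions from the sample tests, with SysTestCaseDataDependency pinning the test to USMF; the class and method names are hypothetical:

[SysTestCaseDataDependency('USMF')]
class MyProviderUsageTest extends SysTestCase // hypothetical class name
{
    public void testCreateOrderWithDefaultItem()
    {
        var data = AtlDataRootNode::construct();

        // Resolves through DemoDataValuesProviderUSMFInvent because the
        // test runs in USMF, so item '1000' is never hard-coded here.
        var item = data.invent().items().default();

        var salesOrder = data.sales().salesOrders().createDefault();
        salesOrder.addLine().setItem(item).setQuantity(1).save();
    }
}

Swap in another provider following the same naming pattern (say, a DemoDataValuesProviderJPMFInvent) and the same test body works in JPMF.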
Number Sequences
One of the more interesting problems I ran into was number sequences being exhausted. Once I was able to get a single order processed and started scaling up, I hit a few hard stops. You can use this post to monitor your number sequences periodically and switch over to new ones that have more numbers available. Using Contoso data, I would repeatedly run into issues with number sequences being only 6 digits and being completely exhausted after a few thousand orders. The number sequences for SO numbers, inventory lots, and parameter IDs for postings were the ones I modified to contain 10 digits.
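If you just need a quick check rather than full monitoring, something like the following does the job. This is a minimal sketch, assuming the standard NumberSequenceTable fields (NextRec and Highest); the threshold is arbitrary:

static void checkNumberSequenceHeadroom()
{
    NumberSequenceTable numSeq;
    int64 remaining;

    while select numSeq
    {
        // Numbers left before this sequence is exhausted.
        remaining = numSeq.Highest - numSeq.NextRec;

        if (remaining < 10000)
        {
            warning(strFmt('Number sequence %1 has only %2 numbers left',
                numSeq.NumberSequence, remaining));
        }
    }
}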
On-Hand Inventory
Rather than journaling in a large quantity of some item in each legal entity, I wanted the test to simply create the inventory needed so we knew it was there in the exact site / warehouse / location where we needed it. Since we could be transacting in any legal entity, I wanted my test to create and then consume the inventory in whichever legal entity the test was running in. This was simple to achieve, but I wanted to call it out; you can see the on-hand adjustment in the test below. The model Acceptance Test Library - Sample Test has some additional tests and scenarios not covered in the MSFT Docs. Additionally, since I was using a DemoDataValuesProvider* class, I didn't even have to know what the item number was. I just had to consistently use the demo data provider.
Initial Test
I wanted to create an order, add a line with inventory dimension defaults, journal in on-hand for that item with those inventory dimensions, post the Sales Order Confirmation, post the Sales Order Packing Slip, and then post the Sales Order Invoice for that order. Using a combination of the MSFT Docs plus examples in classes InventCountingJournalSampleTest and SalesOrderSampleTest, I was able to create this one simple ATL test that does what I'm looking for.
public static void test()
{
    int orderQty = 1;
    var data = AtlDataRootNode::construct();
    var warehouse = data.invent().warehouses().default();
    var site = data.invent().sites().default();
    var items = data.invent().items();
    var onHand = data.invent().onHand();
    var item = items.default();

    // Create a sales order with one line for the default item.
    var salesOrder = data.sales().salesOrders().createDefault();
    var salesLine = salesOrder.addLine().setItem(item)
        .setInventDims([site, warehouse]).setQuantity(orderQty).save();

    // Journal in just enough on-hand to cover the line.
    onHand.adjust().forItem(item)
        .forInventDims([salesLine.inventDim().inventDim().InventSite(),
                        salesLine.inventDim().inventDim().InventLocation()])
        .setQty(orderQty).execute();

    // Confirm, pack, and invoice the order end to end.
    salesOrder.postConfirmation();
    salesOrder.postPackingSlip(SalesUpdate::All);
    salesOrder.postInvoice();
}
Scale
The next issue was scale. Without collecting specific telemetry to support this, the above method could run in 1 company on 1 thread only for as long as the debugger window would let me. I think that was around 120 or 240 seconds. At some point, I realized I had to convert this over to run in batch and also be multithreaded. The performance was also terrible: I'd get maybe 1 completely invoiced order (start to finish) every 4 seconds or so. Using sources like this, this, and mostly this, I was able to cobble together something I could scale to as much as batch would let me throw at it. Another item to consider is that once I have more than 1 thread, I can include more than 1 legal entity. However, as great as the Contoso data is, and as helpful as the Demo Data Value Providers for it are, I was only really able to test in 3 legal entities: JPMF, USMF, and USRT. All other legal entities either didn't have default data providers, had a functionality issue getting in the way, or just wouldn't post as expected. I found this out through trial and error, starting with the legal entities that had Demo Data Value Providers.
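For context, here is a minimal sketch of what each batch task runs. The contract and controller names come from my project, but the service class and its internals here are a simplified, hypothetical version: it just loops over its slice of work item IDs and runs the ATL scenario above in one of the supported legal entities.

class AAXSynthesizeDataBatchBundleService extends SysOperationServiceBase
{
    public void process(AAXSynthesizeDataBatchBundleDataContract _contract)
    {
        // One of the legal entities known to work end to end.
        // (Hard-coded here; varying it per task spreads the load.)
        changecompany('USMF')
        {
            for (int i = _contract.parmMinWorkItemId();
                 i <= _contract.parmMaxWorkItemId();
                 i++)
            {
                // Run the ATL scenario from the Initial Test section for
                // one order. (Hypothetical holder class for that method.)
                AAXSynthesizeDataTest::test();
            }
        }
    }
}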
Infinite Hyper Death
We now have something we can scale up and out, that works in multiple legal entities, and that uses a common interface. Let's take a look at the result.
public static void main(Args _args)
{
    BatchHeader batchHeader;
    SysOperationServiceController controller;
    AAXSynthesizeDataBatchBundleDataContract bundlesDataContract;
    Integer workItemsNumber = 1000000; // number of orders to create
    Integer bundleSize = 1000;         // number of orders per thread

    ttsbegin;
    batchHeader = BatchHeader::construct();
    batchHeader.parmCaption(strFmt('Synthesize Sales Order Posting Data'));

    Integer minItemId = 1;
    Integer maxItemId = bundleSize;

    // Create one batch task per bundle of orders.
    while (minItemId <= workItemsNumber)
    {
        if (maxItemId > workItemsNumber)
        {
            maxItemId = workItemsNumber;
        }

        controller = AAXSynthesizeDataBatchBundleController::construct(_args);
        bundlesDataContract = controller.getDataContractObject();
        bundlesDataContract.parmMinWorkItemId(minItemId);
        bundlesDataContract.parmMaxWorkItemId(maxItemId);
        batchHeader.addTask(controller);

        minItemId = maxItemId + 1;
        maxItemId = maxItemId + bundleSize;
    }

    batchHeader.save();
    ttscommit;
    info('Done');
}
This will create a batch job with 1 task for each 1,000 orders we want to create, so 1,000 tasks in total for the 1 million orders requested. The batch system will run these in parallel and create as many orders as it can, as fast as it can, in our supported legal entities. I let this run for a few weeks and, unfortunately, we never did hit the 1 million order mark because other issues came up for one reason or another - some technical and some functional. However, I was able to create about 400k orders, and I was able to prove out my original scenario from the OData computed fields post: having more data slows down computed columns. But by how much? I'll post on that soon.
Source
All source code can be found here in project SynthesizeTestingData. This was part of my series on OData, so there are lots of other C# projects in there as well.