Using ATL with Local Telemetry
Let's collect some telemetry when using ATL in batch.
Following up on this article, I thought it would be interesting to collect some telemetry on what happens when we use ATL to create a bunch of activity. First, let's review what we were doing with ATL in the referenced article:
- Create a Sales Order with 1 line
- Adjust in inventory for the 1 SO line
- Confirm the Sales Order
- Packing Slip the Sales Order
- Invoice the Sales Order
- Do that approximately 1 billion times (or until we hit some hard limit, like number sequences being exhausted)
- This is done in a developer environment
So we'll want to collect record counts by table, both for the whole instance and per legal entity. First, let's create a database to store this data (alter the file paths for your scenario):
USE [master]
GO
/****** Object: Database [telemetry] Script Date: 11/19/2023 7:40:41 PM ******/
CREATE DATABASE [telemetry]
CONTAINMENT = NONE
ON PRIMARY
( NAME = N'telemetry', FILENAME = N'G:\MSSQL_DATA\telemetry.mdf' , SIZE = 8192KB , MAXSIZE = UNLIMITED, FILEGROWTH = 65536KB )
LOG ON
( NAME = N'telemetry_log', FILENAME = N'H:\MSSQL_LOGS\telemetry_log.ldf' , SIZE = 73728KB , MAXSIZE = 2048GB , FILEGROWTH = 65536KB )
WITH CATALOG_COLLATION = DATABASE_DEFAULT
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [telemetry].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [telemetry] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [telemetry] SET ANSI_NULLS OFF
GO
ALTER DATABASE [telemetry] SET ANSI_PADDING OFF
GO
ALTER DATABASE [telemetry] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [telemetry] SET ARITHABORT OFF
GO
ALTER DATABASE [telemetry] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [telemetry] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [telemetry] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [telemetry] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [telemetry] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [telemetry] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [telemetry] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [telemetry] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [telemetry] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [telemetry] SET DISABLE_BROKER
GO
ALTER DATABASE [telemetry] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [telemetry] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [telemetry] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [telemetry] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [telemetry] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [telemetry] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [telemetry] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [telemetry] SET RECOVERY FULL
GO
ALTER DATABASE [telemetry] SET MULTI_USER
GO
ALTER DATABASE [telemetry] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [telemetry] SET DB_CHAINING OFF
GO
ALTER DATABASE [telemetry] SET FILESTREAM( NON_TRANSACTED_ACCESS = OFF )
GO
ALTER DATABASE [telemetry] SET TARGET_RECOVERY_TIME = 60 SECONDS
GO
ALTER DATABASE [telemetry] SET DELAYED_DURABILITY = DISABLED
GO
ALTER DATABASE [telemetry] SET ACCELERATED_DATABASE_RECOVERY = OFF
GO
ALTER DATABASE [telemetry] SET QUERY_STORE = OFF
GO
ALTER DATABASE [telemetry] SET READ_WRITE
GO
Next, let's create a table to store some telemetry data in:
USE [telemetry]
GO
/****** Object: Table [dbo].[recordCountTable] Script Date: 11/19/2023 7:42:36 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[recordCountTable](
[id] [bigint] IDENTITY(1,1) NOT NULL,
[tableName] [varchar](60) NOT NULL,
[recordCount] [bigint] NOT NULL,
[dateTime] [datetime] NOT NULL,
[dataareaid] [nvarchar](4) NULL,
PRIMARY KEY CLUSTERED
(
[id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE NONCLUSTERED INDEX [tableNameIdx] ON [dbo].[recordCountTable]
(
[tableName] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
GO
CREATE NONCLUSTERED INDEX [DataAreaIdIdx] ON [dbo].[recordCountTable]
(
[dataareaid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
GO
Now we have a place to store some time series telemetry. We'll be tracking the table name from SQL, the record count for the table (instance-wide or for a given legal entity), the date and time of the count, plus optionally the legal entity for that scan. Next, we need to actually set up collecting that telemetry. First, we'll use some SQL to generate the SQL statements our telemetry collection job will execute:
create table #tmpTables (
tableName nvarchar(60)
)

create table #tmpDataAreaIds (
dataAreaId nvarchar(4)
)

--Sales orders
insert into #tmpTables values('SalesTable')
insert into #tmpTables values('SalesLine')
--confirmations
insert into #tmpTables values('CustConfirmJour')
insert into #tmpTables values('CustConfirmTrans')
--packing slips
insert into #tmpTables values('CustPackingSlipJour')
insert into #tmpTables values('CustPackingSlipTrans')
--invoices
insert into #tmpTables values('custInvoiceJour')
insert into #tmpTables values('CustInvoiceTrans')

insert into #tmpDataAreaIds
select id
from axdb.dbo.DATAAREA
where id <> 'dat'
order by 1

-- templates
--insert into recordCountTable(tableName, recordCount, dateTime) values('CustInvoiceJour', (select count(1) from axdb.dbo.CustInvoiceJour), GETDATE())
--insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustInvoiceJour', (select count(1) from axdb.dbo.CustInvoiceJour where DATAAREAID = 'usmf'), GETDATE(), 'usmf')

select 'insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('''+ tableName + ''', (select count(1) from axdb.dbo.' + tableName + ' where DATAAREAID = ''' + #tmpDataAreaIds.dataAreaId + '''), GETDATE(), ''' + #tmpDataAreaIds.dataAreaId + ''')'
from #tmpTables
cross join #tmpDataAreaIds
union
select 'insert into recordCountTable(tableName, recordCount, dateTime) values('''+ tableName + ''', (select count(1) from axdb.dbo.' + tableName + '), GETDATE())'
from #tmpTables

drop table #tmpTables
drop table #tmpDataAreaIds
Running that will generate something like this:
insert into recordCountTable(tableName, recordCount, dateTime) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('CustConfirmTrans', (select count(1) from axdb.dbo.CustConfirmTrans), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('custInvoiceJour', (select count(1) from axdb.dbo.custInvoiceJour), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('CustInvoiceTrans', (select count(1) from axdb.dbo.CustInvoiceTrans), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('CustPackingSlipJour', (select count(1) from axdb.dbo.CustPackingSlipJour), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('CustPackingSlipTrans', (select count(1) from axdb.dbo.CustPackingSlipTrans), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('SalesLine', (select count(1) from axdb.dbo.SalesLine), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime) values('SalesTable', (select count(1) from axdb.dbo.SalesTable), GETDATE())
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'BRMF'), GETDATE(), 'BRMF')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'CNMF'), GETDATE(), 'CNMF')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'DEMF'), GETDATE(), 'DEMF')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'FRRT'), GETDATE(), 'FRRT')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'FRSI'), GETDATE(), 'FRSI')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'GBPM'), GETDATE(), 'GBPM')
insert into recordCountTable(tableName, recordCount, dateTime, dataareaid) values('CustConfirmJour', (select count(1) from axdb.dbo.CustConfirmJour where DATAAREAID = 'GBSI'), GETDATE(), 'GBSI')
And so on. We'll simply want to copy/paste that into Notepad or something similar for the time being. Next, we have to tell SQL Server to collect the data for us. This will collect record counts for the tables used by all of the activity mentioned above, both for the entire instance and for each legal entity in the instance. This is focused on Contoso data but will work with your data as well. First, go into Services on your dev box. Find the "SQL Server Agent" service and start it. Next, set its startup type to "Automatic (Delayed Start)". Next, in SSMS, expand SQL Server Agent on the left-hand side. Right-click on Jobs and click "New Job..."
Give the job a name, then click on Steps.
Click New...
Give the step a name (Step 1 is fine), then paste in the SQL that was generated earlier. Click OK.
Next, click on Schedules, then click New...
Give the schedule a name, like "Hourly", set Occurs to Daily, and set Occurs every to 1 hour with no end date.
Click OK, then OK again. Right-click on the job and click "Start Job at Step..." to confirm it works correctly.
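If you'd rather script the job than click through the UI, something along these lines should create an equivalent job. This is just a sketch using the standard msdb Agent procedures; the job and schedule names are placeholders, and the generated insert statements from earlier would go into @command:

USE [msdb]
GO
-- create the job ( name is a placeholder )
EXEC dbo.sp_add_job @job_name = N'Telemetry record counts'
GO
-- Step 1: paste the generated insert statements into @command
EXEC dbo.sp_add_jobstep
@job_name = N'Telemetry record counts',
@step_name = N'Step 1',
@subsystem = N'TSQL',
@database_name = N'telemetry',
@command = N'/* paste the generated insert statements here */'
GO
-- hourly schedule: daily, every 1 hour, no end date
EXEC dbo.sp_add_schedule
@schedule_name = N'Hourly',
@freq_type = 4, -- daily
@freq_interval = 1,
@freq_subday_type = 8, -- hours
@freq_subday_interval = 1
GO
EXEC dbo.sp_attach_schedule @job_name = N'Telemetry record counts', @schedule_name = N'Hourly'
GO
-- target the local server so the Agent actually runs the job
EXEC dbo.sp_add_jobserver @job_name = N'Telemetry record counts'
GO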
After that, just start the batch job(s) for your bulk activities and collect some telemetry over the next few days. I let this run for about 14 hours or so, and this is what I found:
By hour, we were able to add an average of approximately 15K SOs and process them from creation to invoice. We're specifically looking only at sales order headers (SalesTable records), but we can look at the other telemetry to confirm my assertion that we're looking at fully processed SOs.
When looking at the number of SOs added by hour, we can see a downward trend in volume. This is likely related to the amount of data being added to SQL without any index maintenance: as we process more and more, our indexes get more and more fragmented, making SQL perform worse.
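For reference, a query along these lines is one way to pull those hourly figures out of the telemetry table. It assumes the collection job ran hourly and only looks at the instance-wide rows (where dataareaid is null); the column alias is just for illustration:

;with hourly as
(
select dateTime,
recordCount,
lag(recordCount) over (order by dateTime) as previousCount
from telemetry.dbo.recordCountTable
where tableName = 'SalesTable'
and dataareaid is null -- instance-wide counts only
)
select dateTime,
recordCount - previousCount as salesOrdersAddedSinceLastSample
from hourly
where previousCount is not null
order by dateTime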
What Can We Do With This?
It depends. I'll be extending this project and adding some recording capabilities to it so we can have a dashboard to review the performance of ATL testing as a pre-commit function, just to see if we did anything that would slow things down. DevOps can capture some of this as well, but it's not as easy to capture and consume. Also, this will help provide sample ATL tests that you can adopt for your code base and pipelines. First, it's important to consider that every instance is going to be different. In our example we're using Contoso data and only posting into 3 companies: JPMF, USMF, and USRT. Each legal entity will be configured differently, so you can only compare each legal entity to itself in terms of performance changes. ATL doesn't simulate an entire process, but it's a valuable tool for confirming that any customizations or extensions work as part of a process and also that those changes don't significantly slow the system down.
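As a starting point for that per-legal-entity comparison, a variation of the earlier query partitioned by company would look something like this (again assuming hourly samples; only the partitioning and the dataareaid filter change):

;with perCompany as
(
select dataareaid,
dateTime,
recordCount,
lag(recordCount) over (partition by dataareaid order by dateTime) as previousCount
from telemetry.dbo.recordCountTable
where tableName = 'SalesTable'
and dataareaid is not null -- per legal entity rows only
)
select dataareaid,
dateTime,
recordCount - previousCount as salesOrdersAddedSinceLastSample
from perCompany
where previousCount is not null
order by dataareaid, dateTime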