SQL Server 2022: A Deep Dive into the APPROX_PERCENTILE_CONT Function with JBDB Database

SQL Server 2022 introduces several new features, one of the most exciting being the APPROX_PERCENTILE_CONT function. This function allows for efficient and approximate calculation of percentiles in large datasets, which can be particularly useful for analytics and data-driven decision-making. In this blog, we will explore the APPROX_PERCENTILE_CONT function in detail, using the JBDB database for practical demonstrations. We’ll start with a business use case, dive into the function’s capabilities, and provide a range of T-SQL queries for you to try. Let’s get started!


Business Use Case: Customer Transaction Analysis

Consider a retail company that wants to analyze customer spending behavior. The company has a vast amount of transaction data stored in the JBDB database. To optimize marketing strategies and tailor promotions, they want to identify spending patterns across different customer segments.

For example, the company might want to know the 90th percentile of spending amounts to target high-value customers with exclusive offers. Calculating this percentile accurately in a large dataset can be resource-intensive. The APPROX_PERCENTILE_CONT function offers a solution by providing an approximate, yet efficient, calculation of percentiles.


Understanding the APPROX_PERCENTILE_CONT Function

The APPROX_PERCENTILE_CONT function is designed to compute approximate percentile values for a set of data. It is particularly useful with large datasets, because it trades a small amount of accuracy for a significant performance and memory advantage by using a sketch-based approximation instead of sorting the full dataset.

Syntax:

APPROX_PERCENTILE_CONT ( percentile ) WITHIN GROUP ( ORDER BY numeric_expression )
  • percentile: A value between 0 and 1 that specifies the desired percentile.
  • numeric_expression: The column or expression to calculate the percentile on.

Example 1: Basic Usage

Let’s calculate the 90th percentile of customer transaction amounts.

Setup:

USE JBDB;
GO

CREATE TABLE CustomerTransactions (
    TransactionID INT PRIMARY KEY,
    CustomerID INT,
    TransactionAmount DECIMAL(18, 2),
    TransactionDate DATE
);

INSERT INTO CustomerTransactions (TransactionID, CustomerID, TransactionAmount, TransactionDate)
VALUES
(1, 101, 50.00, '2023-01-15'),
(2, 102, 150.00, '2023-01-16'),
(3, 103, 300.00, '2023-01-17'),
(4, 101, 75.00, '2023-01-18'),
(5, 104, 200.00, '2023-01-19'),
(6, 105, 125.00, '2023-01-20'),
(7, 106, 400.00, '2023-01-21'),
(8, 102, 175.00, '2023-01-22');
GO

Query to Calculate 90th Percentile:

SELECT APPROX_PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY TransactionAmount) AS Approx90thPercentile
FROM CustomerTransactions;

On this sample data the exact 90th percentile is 330, and APPROX_PERCENTILE_CONT returns a value close to it; on small tables the approximation typically matches the exact result. Roughly 90% of transactions fall below that amount, so the company can focus on high-value customers who spend above this threshold.

Example 2: Analyzing Different Percentiles

Let’s calculate different percentiles to understand the distribution of transaction amounts.

Query to Calculate Multiple Percentiles:

SELECT 
    APPROX_PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY TransactionAmount) AS Approx25thPercentile,
    APPROX_PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY TransactionAmount) AS Approx50thPercentile,
    APPROX_PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY TransactionAmount) AS Approx75thPercentile,
    APPROX_PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY TransactionAmount) AS Approx90thPercentile
FROM CustomerTransactions;

These results provide a clear view of the transaction distribution, helping the company to tailor marketing strategies for different customer segments.

Comparing Percentile Results:

  • Compare approximate and exact percentile calculations for the 90th percentile. PERCENTILE_CONT is a window function, so it requires an OVER () clause and no GROUP BY; a scalar subquery supplies the approximate value alongside it:
SELECT TOP (1)
    (SELECT APPROX_PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY TransactionAmount)
     FROM CustomerTransactions) AS Approx90thPercentile,
    PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY TransactionAmount) OVER () AS Exact90thPercentile
FROM CustomerTransactions;

Segmenting Customers by Spending:

  • Identify customers whose spending is in the top 10%:
SELECT CustomerID, TransactionAmount
FROM CustomerTransactions
WHERE TransactionAmount >= (SELECT APPROX_PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY TransactionAmount)
                             FROM CustomerTransactions);

Analyzing Spending Patterns Over Time:

  • Calculate monthly spending percentiles to identify trends:
SELECT 
    DATEPART(MONTH, TransactionDate) AS Month,
    APPROX_PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY TransactionAmount) AS MedianTransaction
FROM CustomerTransactions
GROUP BY DATEPART(MONTH, TransactionDate)
ORDER BY Month;

Combining Percentiles with Other Aggregations:

  • Find the average transaction amount for each percentile group:
SELECT 
    PercentileGroup,
    AVG(TransactionAmount) AS AvgTransactionAmount
FROM (
    SELECT 
        TransactionAmount,
        NTILE(4) OVER (ORDER BY TransactionAmount) AS PercentileGroup
    FROM CustomerTransactions
) AS SubQuery
GROUP BY PercentileGroup;

Conclusion

The APPROX_PERCENTILE_CONT function in SQL Server 2022 is a powerful tool for efficiently computing approximate percentiles in large datasets. By using this function, businesses can gain valuable insights into data distributions and make informed decisions based on these insights. Whether you’re analyzing customer spending, sales trends, or any other data, the APPROX_PERCENTILE_CONT function offers a quick and efficient way to understand your data.

Happy querying!

For more tutorials and tips on SQL Server, including performance tuning and database management, be sure to check out our JBSWiki YouTube channel.

Thank You,
Vivek Janakiraman

Disclaimer:
The views expressed on this blog are mine alone and do not reflect the views of my company or anyone else. All postings on this blog are provided “AS IS” with no warranties, and confer no rights.

SQL Server 2022: Unleashing the Power of the GENERATE_SERIES Function

In SQL Server 2022, the introduction of the GENERATE_SERIES function marks a significant enhancement, empowering developers and analysts with a flexible and efficient way to generate sequences of numbers. This feature, akin to similar functions in other database systems, simplifies tasks involving sequence generation, such as creating time series data, generating test data, and more.

In this blog, we’ll explore the GENERATE_SERIES function in detail, using the JBDB database to demonstrate its capabilities. We’ll start with a practical business use case, followed by a comprehensive guide on how to use the function. Let’s dive in!

Business Use Case: Sales Forecasting

Imagine you are working for a retail company, and your task is to generate a sales forecast for the next year. You have historical sales data and need to project future sales based on trends. A crucial step in this process is to create a series of dates representing each day of the next year, which will serve as the basis for the forecast.

The GENERATE_SERIES function can be a game-changer here, allowing you to quickly generate a range of dates without resorting to complex loops or recursive queries.

Introducing the GENERATE_SERIES Function

The GENERATE_SERIES function generates a series of numbers or dates. Its syntax is straightforward:

GENERATE_SERIES(start, stop, step)
  • start: The starting value of the sequence.
  • stop: The ending value of the sequence.
  • step: The increment value between each number in the series.

Let’s see this in action with some practical examples!

Example 1: Basic Numeric Series

To generate a series of numbers from 1 to 10:

SELECT value
FROM GENERATE_SERIES(1, 10, 1);

Example 2: Date Series for Forecasting

GENERATE_SERIES accepts numeric arguments only, so passing date strings directly raises an error. To generate a series of dates for each day of 2023, generate day offsets and add them to the start date:

SELECT DATEADD(DAY, value, CAST('2023-01-01' AS DATE)) AS ForecastDate
FROM GENERATE_SERIES(0, 364, 1); -- 365 days of 2023

Generating a Series of Dates Using a CTE

Since GENERATE_SERIES supports numeric sequences only, a recursive CTE is another common way to build a date series directly. Here’s how to create a series of dates for the year 2023:

-- Create a recursive CTE to generate a series of dates
WITH DateSeries AS (
    -- Anchor member: start date
    SELECT CAST('2023-01-01' AS DATE) AS ForecastDate
    UNION ALL
    -- Recursive member: add one day to the previous date
    SELECT DATEADD(DAY, 1, ForecastDate)
    FROM DateSeries
    WHERE ForecastDate < '2023-12-31'
)
-- Query to select the generated dates
SELECT ForecastDate
FROM DateSeries
OPTION (MAXRECURSION 0); -- Remove recursion limit

Implementing the Use Case: Sales Forecasting

Let’s apply the GENERATE_SERIES function to our sales forecasting scenario. Suppose we have a table Sales in the JBDB database with historical sales data. Our goal is to project future sales for each day of the next year.

Step 1: Creating the JBDB and Sales Table

First, we create the JBDB database and the Sales table:

CREATE DATABASE JBDB;
GO

USE JBDB;
GO

CREATE TABLE Sales (
    SaleDate DATE,
    Amount DECIMAL(10, 2)
);

Step 2: Inserting Historical Data

Next, let’s insert some historical data into the Sales table:

INSERT INTO Sales (SaleDate, Amount)
VALUES
('2022-01-01', 100.00),
('2022-01-02', 150.00),
('2022-01-03', 200.00),
-- Additional data...
('2022-12-31', 250.00);

Step 3: Generating Future Dates and Forecasting

Now, we generate future dates with a recursive CTE and join them with our historical data to create a sales forecast:

-- Generate a series of future dates
WITH DateSeries AS (
    SELECT CAST('2023-01-01' AS DATE) AS ForecastDate
    UNION ALL
    SELECT DATEADD(DAY, 1, ForecastDate)
    FROM DateSeries
    WHERE ForecastDate < '2023-12-31'
),
-- Combine with historical sales data
SalesForecast AS (
    SELECT
        f.ForecastDate,
        ISNULL(s.Amount, 0) AS HistoricalAmount
    FROM
        DateSeries f
        LEFT JOIN Sales s ON s.SaleDate = DATEADD(YEAR, -1, f.ForecastDate) -- line up each 2023 date with its 2022 sale
)
-- Project future sales
SELECT
    ForecastDate,
    HistoricalAmount,
    -- Simple projection logic (for demonstration)
    HistoricalAmount * 1.05 AS ProjectedAmount
FROM SalesForecast
OPTION (MAXRECURSION 0); -- Remove recursion limit

In this query:

  • We generate a series of dates for the year 2023 using a recursive CTE.
  • We join these dates with the historical sales data to create a comprehensive sales forecast.
  • A simple projection logic is applied for demonstration, assuming a 5% increase in sales.
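The same forecast can be written with GENERATE_SERIES instead of a recursive CTE, which avoids the MAXRECURSION hint entirely. This is a minimal sketch assuming the Sales table created above; the join condition lining up each 2023 date with the sale recorded one year earlier is an illustrative choice, not the only possible baseline:

```sql
-- Build the 2023 date axis from day offsets instead of recursion
SELECT
    DATEADD(DAY, gs.value, CAST('2023-01-01' AS DATE)) AS ForecastDate,
    ISNULL(s.Amount, 0) AS HistoricalAmount,
    ISNULL(s.Amount, 0) * 1.05 AS ProjectedAmount   -- same 5% demo projection
FROM GENERATE_SERIES(0, 364, 1) AS gs               -- 365 days of 2023
LEFT JOIN Sales AS s
    ON s.SaleDate = DATEADD(YEAR, -1, DATEADD(DAY, gs.value, CAST('2023-01-01' AS DATE)));
```

Because the series is set-based rather than recursive, no OPTION (MAXRECURSION 0) is needed.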

Generate a Series of Numbers with Custom Step Size

Generate a sequence of numbers from 1 to 50 with a step size of 5:

-- Generate a sequence of numbers with a custom step size
SELECT value
FROM GENERATE_SERIES(1, 50, 5);

Generate a Series of Dates with Custom Step Size

Generate a series of dates from today to 30 days into the future with a step size of 5 days:

-- Generate a series of dates with a custom step size (5 days)
WITH DateSeries AS (
    SELECT DATEADD(DAY, value * 5, CAST(GETDATE() AS DATE)) AS ForecastDate
    FROM GENERATE_SERIES(0, 6, 1) -- 0 to 6 will generate 7 dates
)
SELECT ForecastDate
FROM DateSeries;

Generate a Series of Random Numbers

Generate a series of random numbers between 1 and 100:

-- Generate a series of random numbers between 1 and 100
SELECT ABS(CHECKSUM(NEWID())) % 100 + 1 AS RandomNumber
FROM GENERATE_SERIES(1, 10, 1); -- Generate 10 random numbers

Generate a Series of Time Intervals

Generate a series of time intervals (every 15 minutes) for one hour:

-- Generate a series of time intervals (15 minutes) for one hour
WITH TimeSeries AS (
    SELECT DATEADD(MINUTE, value * 15, CAST('2024-01-01 00:00:00' AS DATETIME)) AS TimeStamp
    FROM GENERATE_SERIES(0, 3, 1) -- 0 to 3 will generate 4 intervals
)
SELECT TimeStamp
FROM TimeSeries;

Generate a Series of Sequential IDs

Generate a series of sequential IDs from 1001 to 1010:

-- Generate a sequence of sequential IDs
SELECT value + 1000 AS SequentialID
FROM GENERATE_SERIES(1, 10, 1);

Generate a Series of Numeric Values with Non-Uniform Steps

Generate a series of numbers with varying steps (e.g., 1, 2, 4, 8, …):

-- Generate a series of numbers with varying steps (powers of 2)
WITH NumberSeries AS (
    SELECT 1 AS value
    UNION ALL
    SELECT value * 2
    FROM NumberSeries
    WHERE value < 64
)
SELECT value
FROM NumberSeries
OPTION (MAXRECURSION 0);

Generate a Series of Dates with Monthly Intervals

Generate a series of dates with a monthly interval for one year:

-- Generate a series of dates with monthly intervals for one year
WITH MonthSeries AS (
    SELECT DATEADD(MONTH, value, CAST('2024-01-01' AS DATE)) AS MonthStart
    FROM GENERATE_SERIES(0, 11, 1) -- 0 to 11 will generate 12 months
)
SELECT MonthStart
FROM MonthSeries;

Generate a Series of Numbers and Calculate Cumulative Sum

Generate a series of numbers and calculate their cumulative sum:

-- Generate a series of numbers and calculate the cumulative sum
WITH NumberSeries AS (
    SELECT value
    FROM GENERATE_SERIES(1, 10, 1)
),
CumulativeSum AS (
    SELECT
        value,
        SUM(value) OVER (ORDER BY value) AS CumulativeSum
    FROM NumberSeries
)
SELECT value, CumulativeSum
FROM CumulativeSum;

Generate a Series of Custom Random Dates

Generate a series of random dates within a specific range:

-- Generate a series of random dates within a specific range
WITH RandomDates AS (
    SELECT DATEADD(DAY, ABS(CHECKSUM(NEWID())) % 365, CAST('2024-01-01' AS DATE)) AS RandomDate
    FROM GENERATE_SERIES(1, 10, 1) -- Generate 10 random dates
)
SELECT RandomDate
FROM RandomDates;

Generate a Series of Numbers and Create Custom Labels

Generate a series of numbers and create custom labels:

-- Generate a series of numbers and create custom labels
SELECT value AS Number, 'Label_' + CAST(value AS VARCHAR(10)) AS CustomLabel
FROM GENERATE_SERIES(1, 10, 1);

Conclusion

The GENERATE_SERIES function in SQL Server 2022 is a versatile tool that can significantly simplify the generation of sequences, whether for numeric ranges or date series. Its applications range from creating time series data for analytics to generating test data for development and testing purposes.

By leveraging GENERATE_SERIES, businesses can streamline their data workflows, enhance forecasting accuracy, and improve decision-making processes. Whether you’re a database administrator, developer, or data analyst, this function is a valuable addition to your SQL toolkit.

Feel free to experiment with GENERATE_SERIES and explore its potential in your projects!


SQL Server 2022 Query Store Enhancements: A Comprehensive Guide

SQL Server 2022 brings significant enhancements to the Query Store, a powerful feature for monitoring and optimizing query performance. In this blog, we’ll explore the improvements, how to leverage Query Store for performance tuning, and its application in Always On Availability Groups. We’ll also provide T-SQL queries to identify costly queries and discuss the advantages and business use cases of using Query Store.

What is Query Store?

Query Store is a feature in SQL Server that captures a history of queries, plans, and runtime statistics. It helps database administrators (DBAs) and developers identify and troubleshoot performance issues by providing insights into how queries are performing over time.
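Before relying on Query Store data, it is worth confirming that the feature is actually enabled and capturing queries. As a quick sanity check, the standard sys.database_query_store_options catalog view reports the current state for the database:

```sql
-- Check whether Query Store is enabled and healthy for the current database
SELECT
    actual_state_desc,        -- OFF, READ_ONLY, or READ_WRITE
    desired_state_desc,
    current_storage_size_mb,
    max_storage_size_mb
FROM sys.database_query_store_options;
```

If actual_state_desc differs from desired_state_desc (for example, Query Store fell back to READ_ONLY because storage filled up), that is the first thing to investigate.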

Key Enhancements in SQL Server 2022

  1. Support for Always On Availability Groups Read Replicas: One of the standout features in SQL Server 2022 is the extension of Query Store to read-only replicas in Always On Availability Groups. This allows monitoring of read workload performance without affecting the primary replica’s performance.
  2. Improved Query Performance Analysis: Enhancements in Query Store provide more granular control over data collection and retention policies, allowing for more precise performance tuning.
  3. Automatic Plan Correction: Query Store can automatically identify and revert to a previously good query plan if the current plan causes performance regressions.
  4. Enhanced Data Cleanup: SQL Server 2022 introduces more efficient data cleanup processes, ensuring that Query Store doesn’t consume unnecessary storage space.
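To illustrate the retention and plan-correction enhancements above, here is a sketch of how they are typically configured. The option names are standard ALTER DATABASE syntax, but the specific values (30 days, 1 GB, and so on) are illustrative placeholders to tune for your workload:

```sql
-- Tune Query Store capture and retention (illustrative values)
ALTER DATABASE [YourDatabaseName]
SET QUERY_STORE = ON (
    OPERATION_MODE = READ_WRITE,
    QUERY_CAPTURE_MODE = AUTO,                          -- skip insignificant ad hoc queries
    MAX_STORAGE_SIZE_MB = 1000,                         -- cap Query Store at ~1 GB
    CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30)  -- purge stats older than 30 days
);

-- Let SQL Server automatically revert to the last known good plan on regression
ALTER DATABASE [YourDatabaseName]
SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);
```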

Leveraging Query Store for Performance Tuning

To make the most of Query Store, follow these steps:

  1. Enable Query Store: Ensure that Query Store is enabled for your database. You can do this using the following T-SQL command:

ALTER DATABASE [YourDatabaseName] SET QUERY_STORE = ON;

  2. Monitor Performance: Use Query Store views and built-in reports in SQL Server Management Studio (SSMS) to analyze query performance over time.
  3. Identify Regressions: Leverage the Automatic Plan Correction feature to detect and fix query performance regressions automatically.
  4. Optimize Queries: Use the insights from Query Store to optimize queries and indexes, reducing resource consumption and improving response times.

Using Query Store on Always On Read Replicas

Query Store on read replicas allows you to monitor read-only workloads without impacting the primary replica. To enable and configure Query Store on read replicas, use the following steps:

Enable Query Store for Secondary Replicas: Secondary replicas are read-only, so all configuration is done on the primary replica. First, make sure Query Store is enabled as usual:

ALTER DATABASE [YourDatabaseName] SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

Then, still on the primary, enable Query Store for the secondary replicas:

ALTER DATABASE [YourDatabaseName] FOR SECONDARY SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

Monitor Read Workloads: Use Query Store to analyze read workload performance on secondary replicas. This helps in identifying and optimizing queries executed on read-only replicas.
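Runtime statistics collected from secondary replicas are tagged with the replica they came from. As a sketch (assuming the replica feature above is enabled), the sys.query_store_replicas catalog view introduced in SQL Server 2022 lists the replicas for which Query Store is tracking statistics:

```sql
-- List the replicas whose workloads Query Store is tracking
SELECT replica_group_id, role_type, replica_name
FROM sys.query_store_replicas;
```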

T-SQL Queries to Check Costly Queries

Here are some T-SQL queries to find costly queries in terms of CPU, reads, and duration. Note that sys.query_store_runtime_stats is keyed by plan_id, so it must be joined to the query text through sys.query_store_plan, sys.query_store_query, and sys.query_store_query_text.

On Primary Replica

Top Queries by CPU Usage:

SELECT TOP 10
    q.query_id,
    rs.execution_type_desc,
    rs.avg_cpu_time,
    qt.query_sql_text
FROM sys.query_store_runtime_stats rs
JOIN sys.query_store_plan p ON rs.plan_id = p.plan_id
JOIN sys.query_store_query q ON p.query_id = q.query_id
JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id
ORDER BY rs.avg_cpu_time DESC;

Top Queries by Logical Reads:

SELECT TOP 10
    q.query_id,
    rs.execution_type_desc,
    rs.avg_logical_io_reads,
    qt.query_sql_text
FROM sys.query_store_runtime_stats rs
JOIN sys.query_store_plan p ON rs.plan_id = p.plan_id
JOIN sys.query_store_query q ON p.query_id = q.query_id
JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id
ORDER BY rs.avg_logical_io_reads DESC;

Top Queries by Duration:

SELECT TOP 10
    q.query_id,
    rs.execution_type_desc,
    rs.avg_duration,
    qt.query_sql_text
FROM sys.query_store_runtime_stats rs
JOIN sys.query_store_plan p ON rs.plan_id = p.plan_id
JOIN sys.query_store_query q ON p.query_id = q.query_id
JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id
ORDER BY rs.avg_duration DESC;

On Read Replica

The queries on the read replica are similar, but keep in mind that Query Store on read replicas operates in a read-only mode:

-- For CPU Usage, Logical Reads, and Duration, the same queries as above can be used.

Advantages of Using Query Store

  1. Historical Performance Data: Query Store maintains historical data, making it easier to analyze and troubleshoot performance issues over time.
  2. Automated Plan Correction: Automatically detects and corrects query plan regressions, reducing the need for manual intervention.
  3. Enhanced Monitoring: Extended support for read replicas allows comprehensive monitoring of all workloads in Always On Availability Groups.
  4. Improved Resource Management: Helps in identifying resource-intensive queries, enabling better resource allocation and management.

Business Use Case: E-commerce Website

Consider an e-commerce platform where performance is critical, especially during peak shopping seasons. By leveraging Query Store:

  • The DBA can monitor and optimize queries that retrieve product details, prices, and inventory status, ensuring quick response times for users.
  • Automatic Plan Correction helps maintain optimal performance even when changes are made to the database or application code.
  • Using Query Store on read replicas allows offloading read workloads from the primary replica, ensuring that write operations remain unaffected.

Conclusion

SQL Server 2022’s Query Store enhancements offer a powerful toolset for monitoring and optimizing database performance. Whether you’re managing a high-traffic e-commerce site or a critical financial application, leveraging Query Store can lead to significant performance improvements and resource optimization. Start exploring these features today to get the most out of your SQL Server environment!
