AWS Clean Rooms pricing overview
With AWS Clean Rooms, you can use flexible SQL analysis rules, privacy-enhancing machine learning (ML), and AWS Entity Resolution on AWS Clean Rooms to meet your business needs. When you use SQL analysis, you or a designated collaborator pay for the compute capacity of the Spark SQL or SQL queries run in a collaboration, on a Clean Rooms Processing Unit (CRPU)-hours basis. Learn more about CRPU-hours below. When you use AWS Clean Rooms ML custom modeling, you pay for three things: the number of records used in training, inference, or both, on a price-per-1,000-records basis; usage of the compute instance type you choose; and the compute capacity of the Spark SQL queries run to create the input data for training and inference in a collaboration. When you use AWS Clean Rooms ML lookalike modeling, you pay only for the model trainings you request and for the lookalike segments created, on a price-per-1,000-profiles basis. When you use AWS Entity Resolution on AWS Clean Rooms, you pay on a price-per-1,000-records basis.
Note: Pricing can vary per AWS Region depending on which capabilities you use, and the AWS Free Tier is not available for AWS Clean Rooms ML or AWS Entity Resolution.
AWS Clean Rooms is available in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), Europe (London), and Europe (Stockholm).
AWS Free Tier
Get 9 CRPU-hours per month for the first 12 months when using SQL aggregation and list analysis rules with the AWS Clean Rooms Free Tier. Learn more about CRPU-hours below.
SQL pricing
AWS Clean Rooms offers Spark SQL or SQL analytics engines to run queries in a Clean Rooms collaboration.
You can choose the Spark analytics engine to run queries using the Spark SQL dialect in AWS Clean Rooms collaborations. AWS Clean Rooms Spark SQL offers configurable compute sizes, giving you more control over price-performance when running SQL workloads. To apply AWS Clean Rooms Differential Privacy, or to use aggregation or list analysis rules in a collaboration, you must use SQL as the analytics engine.
Spark SQL pricing
AWS Clean Rooms measures compute capacity in Clean Rooms Processing Unit (CRPU)-hours, billed on a per-second basis (with a 60-second minimum charge). There are no resources to manage and no upfront costs, and you are not charged for startup or shutdown time. When you run Spark SQL queries on AWS Clean Rooms, payment responsibility can be assigned either to the collaborator running the queries or to any member participating in the collaboration. The member responsible for payment is billed for all queries in the collaboration. AWS Clean Rooms Spark SQL is available only for the custom analysis rule.
AWS Clean Rooms Spark SQL charges an hourly rate based on the number of CRPUs used to run your query. You pay for compute usage at a price per CRPU-hour, with the option to choose from different instances to run your queries. You can choose from four compute engine configurations, based on your performance, scale, and cost requirements. By default, AWS Clean Rooms allocates 32 CRPUs to each Spark SQL query; you can optionally choose workload sizes up to 256 CRPUs or as low as 4 CRPUs.
Instance type       Instances       Total CRPUs per hour
CR.1X               2               4
CR.1X (default)     16 (default)    32 (default)
CR.4X               8               64
CR.4X               32              256

Note: You can choose a compute engine configuration with more instances to allocate more resources to your Spark SQL queries. A larger configuration distributes the workload across more instances to meet your job requirements and limits. Learn more about the associated vCPU, memory, and storage for each configuration here.
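The total CRPU figures above are the instance count multiplied by a per-instance weight implied by the table (2 CRPUs per CR.1X instance, 8 per CR.4X; these weights are inferred from the table rows rather than documented constants). A minimal sketch in Python:

```python
# CRPUs per instance, as implied by the configuration table above
# (inferred from the table rows, not an official constant).
CRPUS_PER_INSTANCE = {"CR.1X": 2, "CR.4X": 8}

def total_crpus(instance_type: str, instances: int) -> int:
    """Total CRPUs allocated to a Spark SQL query for a given configuration."""
    return CRPUS_PER_INSTANCE[instance_type] * instances

# Reproduce the four table rows: 4, 32 (default), 64, 256
for itype, count in [("CR.1X", 2), ("CR.1X", 16), ("CR.4X", 8), ("CR.4X", 32)]:
    print(itype, count, total_crpus(itype, count))
```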
Spark SQL custom analysis rule pricing dimension
Spark SQL compute: You pay for the length of time that your Spark SQL queries take to run, at a price-per-CRPU-hour rate. AWS Clean Rooms Spark SQL compute pricing varies by AWS Region.
Spark SQL custom analysis rule pricing examples
Example 1 – Spark SQL query (using the default CR.1X with 16 instances)
You want to use Spark SQL queries to run custom analysis across configured tables from multiple collaboration members. Your Spark SQL query runs for 3 minutes, and needs to be processed three times per day in an AWS Clean Rooms collaboration in US East (N. Virginia). You want to use the default AWS Clean Rooms Spark SQL compute engine configuration with CR.1X and 16 instances, which uses a total capacity of 32 CRPUs per hour to run the queries.
The following table summarizes your total usage for the day and the year:
Query execution period    The query was run three times per day, each taking 3 minutes = 9 minutes = 540 seconds / 3,600 = 0.150 hours
Capacity used             4.8 CRPU-hours = 0.150 hours * 32 CRPUs (CR.1X, 16 instances)
Daily charges             $9.60 = 4.8 CRPU-hours * $2.00 per CRPU-hour
Yearly charges            $3,504.00 = $9.60 * 365

Example 2 – Spark SQL query (using CR.4X with 8 instances)
You want to use Spark SQL queries to run custom analysis across configured tables from multiple collaboration members. Your Spark SQL query runs once per day for 3 minutes in an AWS Clean Rooms collaboration in US East (N. Virginia). You choose to use an AWS Clean Rooms Spark SQL compute engine configuration with CR.4X and 8 instances, which uses a total capacity of 64 CRPUs per hour to run the queries.
The following table summarizes your total usage for the day and the year:
Query execution period    The query was run once, taking 3 minutes = 180 seconds / 3,600 = 0.050 hours
Capacity used             3.2 CRPU-hours = 0.050 hours * 64 CRPUs (CR.4X, 8 instances)
Daily charges             $6.40 = 3.2 CRPU-hours * $2.00 per CRPU-hour
Yearly charges            $2,336.00 = $6.40 * 365
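Both examples follow the same arithmetic: billable time (metered per second, with a 60-second minimum per run) times total CRPUs, times the price per CRPU-hour. A sketch reproducing the two tables above, assuming the $2.00 per CRPU-hour US East (N. Virginia) rate used in the examples; the function name is illustrative:

```python
def daily_spark_sql_charge(run_seconds: float, runs_per_day: int, crpus: int,
                           rate_per_crpu_hour: float = 2.00) -> float:
    """Daily charge: per-second metering with a 60-second minimum per query run."""
    billable_seconds = max(run_seconds, 60) * runs_per_day
    crpu_hours = billable_seconds / 3600 * crpus
    return crpu_hours * rate_per_crpu_hour

# Example 1: CR.1X x 16 instances (32 CRPUs), 3-minute query, three runs per day
daily_1 = daily_spark_sql_charge(180, 3, 32)
print(round(daily_1, 2), round(daily_1 * 365, 2))   # 9.6 3504.0

# Example 2: CR.4X x 8 instances (64 CRPUs), 3-minute query, once per day
daily_2 = daily_spark_sql_charge(180, 1, 64)
print(round(daily_2, 2), round(daily_2 * 365, 2))   # 6.4 2336.0
```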
SQL pricing
AWS Clean Rooms measures compute capacity in CRPU-hours on a per-second basis (with a 60-second minimum charge). The base capacity for AWS Clean Rooms SQL collaborations is 32 CRPUs—this capacity can scale up or down automatically based on usage patterns. AWS Clean Rooms automatically scales up or down to meet your query workload demands and shuts down during periods of inactivity, saving you administration time and costs. AWS Clean Rooms SQL pricing varies by AWS Region and type of analysis rule.
When you run SQL queries on AWS Clean Rooms, payment responsibility can be assigned either to the collaborator running the queries or to any member participating in the collaboration. The member responsible for payment is billed for all queries in the collaboration.
Custom analysis rule pricing
Custom analysis rule pricing example
You want to run A/B test performance queries across configured tables with a custom analysis rule from multiple collaboration members. The query needs to be processed three times from 7 AM to 7 PM in an AWS Clean Rooms collaboration in US East (N. Virginia), with an average job completion time of 3 minutes. AWS Clean Rooms uses the default capacity of 32 CRPUs to run the queries.
The following table summarizes your total usage for the day:
Query execution period    The query was run three times from 7 AM to 7 PM, each taking 3 minutes = 9 minutes = 540 seconds / 3,600 = 0.150 hours
Capacity used             32 CRPUs
Daily charges             $9.60 = 0.150 hours * 32 CRPUs * $2.00 per CRPU-hour
Yearly charges            $3,504.00 = $9.60 * 365
Aggregation and list analysis rules pricing
Aggregation and list analysis rules pricing example
You want to run customer overlap queries on a configured table with an aggregation analysis rule. The query needs to be processed three times from 7 AM to 7 PM in an AWS Clean Rooms collaboration in US East (N. Virginia), with an average job completion time of 2 minutes and 30 seconds. AWS Clean Rooms uses the default capacity of 32 CRPUs to run the queries.
The following table summarizes your total usage for the day:
Query execution period    The query was run three times from 7 AM to 7 PM, each taking 2 minutes and 30 seconds = 7 minutes and 30 seconds = 450 seconds / 3,600 = 0.125 hours
Capacity used             32 CRPUs
Daily charges             $2.624 = 0.125 hours * 32 CRPUs * $0.656 per CRPU-hour
Yearly charges            $957.76 = $2.624 * 365
AWS Clean Rooms Differential Privacy pricing
Custom analysis rule pricing example (with AWS Clean Rooms Differential Privacy enabled)
You want to run advertising reach queries on a configured table with a custom analysis rule that uses AWS Clean Rooms Differential Privacy for an additional layer of protection. The total cost per CRPU-hour is $4.00 ($2.00 per CRPU-hour for SQL compute + $2.00 per CRPU-hour for AWS Clean Rooms Differential Privacy). The query needs to be processed once from 7 AM to 7 PM in an AWS Clean Rooms collaboration in US East (N. Virginia), with an average job completion time of 3 minutes. AWS Clean Rooms uses the default capacity of 32 CRPUs to run the queries.
The following table summarizes your total usage for the day and the year:
Query execution period    The query was run once from 7 AM to 7 PM and took 3 minutes = 180 seconds / 3,600 = 0.050 hours
Capacity used             32 CRPUs
Daily charges             $6.40 = 0.050 hours * 32 CRPUs * $4.00 per CRPU-hour
Yearly charges            $2,336.00 = $6.40 * 365
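All three SQL engine examples use the same formula: hours, times 32 CRPUs, times the per-CRPU-hour rate for the analysis rule in question. A sketch that reproduces the three tables above (the rates come from the examples themselves; the Differential Privacy rate is the $2.00 compute rate plus the $2.00 Differential Privacy rate):

```python
DEFAULT_CRPUS = 32  # base capacity used in all three examples

# (seconds per run, runs per day, $ per CRPU-hour) for each example above
examples = {
    "Custom rule":                        (180, 3, 2.00),
    "Aggregation/list rule":              (150, 3, 0.656),
    "Custom rule + Differential Privacy": (180, 1, 4.00),
}

for name, (seconds, runs, rate) in examples.items():
    daily = seconds * runs / 3600 * DEFAULT_CRPUS * rate
    print(f"{name}: ${daily:.3f} per day, ${daily * 365:.2f} per year")
# Custom rule: $9.600 per day, $3504.00 per year
# Aggregation/list rule: $2.624 per day, $957.76 per year
# Custom rule + Differential Privacy: $6.400 per day, $2336.00 per year
```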
AWS Clean Rooms ML pricing
AWS Clean Rooms ML supports custom and lookalike machine learning (ML) modeling. With custom modeling, you can bring a custom model for training and run inference on collective datasets, without sharing underlying data or intellectual property among collaborators. With lookalike modeling, you can use an AWS-authored model to generate an expanded set of similar profiles based on a small sample of profiles that your partners bring to a collaboration.
Note: AWS Free Tier is not available for AWS Clean Rooms ML.
Custom modeling pricing
When you run AWS Clean Rooms ML custom modeling, you pay for training, inference, or both, across three cost dimensions: the number of records, on a price-per-1,000-records basis; usage of the compute instance type you choose; and the compute capacity of the Spark SQL queries run to create the input data for training and inference in a collaboration. The three dimensions are detailed below, followed by a sketch that combines them.
Note: To apply AWS Clean Rooms ML custom modeling, you must use Spark SQL as the analytics engine. See AWS Clean Rooms Spark SQL pricing for details.
Custom modeling pricing dimensions
1. Number of records: You pay for the number of records on a price-per-1,000-records basis.
Note: Pricing for the number of records for training and inference does not vary per AWS Region.
2. Custom modeling compute: You pay for usage of the compute instance type you choose and the length of time to complete training and inference.
Note: AWS Clean Rooms ML custom modeling compute pricing can vary per AWS Region depending on which capabilities you use. You are billed for compute based on the length of each training and inference job you run.
3. Spark SQL compute: You pay for the length of time that your Spark SQL queries take to run on a price-per-CRPU-hour rate with the option to choose from different instances to run your queries. You can choose from 4 compute engine configuration options to run your queries, based on your performance, scale, and cost requirements. By default, AWS Clean Rooms allocates 32 CRPUs to each Spark SQL query, and you can optionally choose workload sizes up to 256 CRPUs or as low as 4 CRPUs.
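Combining the three dimensions, a single job's charge is (records / 1,000 * record rate) + (instance hours * instance rate) + (CRPU-hours * CRPU rate). A minimal sketch with hypothetical parameter names; the rates are inputs, and representative US East (N. Virginia) values appear in the examples below:

```python
def custom_modeling_charge(records: int, rate_per_1k_records: float,
                           instance_hours: float, instance_rate: float,
                           crpus: int, query_hours: float,
                           rate_per_crpu_hour: float) -> float:
    """Total charge for one custom modeling job across the three cost dimensions."""
    records_charge = records / 1_000 * rate_per_1k_records   # 1. number of records
    compute_charge = instance_hours * instance_rate          # 2. custom modeling compute
    spark_charge = query_hours * crpus * rate_per_crpu_hour  # 3. Spark SQL compute
    return records_charge + compute_charge + spark_charge
```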
Custom modeling pricing example (for training)
You want to use AWS Clean Rooms ML custom modeling to train a proprietary model to detect fraudulent transactions with another financial institution. You want to train this model on a collective dataset consisting of 30,000,000 transaction records from you and another collaborator. You want to use an ml.p3.8xlarge instance, with each training job averaging 6 hours to complete. Your Spark SQL query to pull the list of suspect transactions runs for 1 hour. You want to use the AWS Clean Rooms Spark SQL compute engine configuration with CR.1X and 16 instances, which uses a total capacity of 32 CRPUs per hour to run the queries.
The following table summarizes your usage and charges in US East (N. Virginia):
Number of records in training dataset    30 million records                           $300.00 = 30M * $0.01 per 1,000 records
Custom modeling compute                  ml.p3.8xlarge for 6 hours                    $88.128 = $14.688 * 6 hours
Spark SQL compute for training data      32 CRPUs (CR.1X, 16 instances) for 1 hour    $64.00 = 1 hour * 32 CRPUs * $2.00 per CRPU-hour
Total per-training charges                                                            $452.128 = $300.00 + $88.128 + $64.00
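As a quick check of the table, the same arithmetic in code (the $0.01 per 1,000 records and $14.688 per hour ml.p3.8xlarge rates come from the example itself):

```python
training_charge = (30_000_000 / 1_000 * 0.01   # number of records: $300.00
                   + 14.688 * 6                # ml.p3.8xlarge for 6 hours: $88.128
                   + 1 * 32 * 2.00)            # 32 CRPUs for 1 hour: $64.00
print(round(training_charge, 3))               # 452.128
```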
Custom modeling pricing example (for inference)
You want to use AWS Clean Rooms ML custom modeling to predict the likelihood of prospective customers clicking your ads on an e-commerce website. You want to run inference on a dataset consisting of 30,000,000 customer records from your e-commerce partner. You want to use an ml.m5.4xlarge instance, with each inference job averaging 2 hours to complete. The Spark SQL query runs for 1 hour to generate the data for inference. You want to use the AWS Clean Rooms Spark SQL compute engine configuration with CR.4X and 8 instances, which uses a total capacity of 64 CRPUs per hour to run the queries.
The following table summarizes your usage and charges in US East (N. Virginia):
Number of records in inference dataset    30 million records                          $75.00 = 30M * $0.0025 per 1,000 records
Custom modeling compute                   ml.m5.4xlarge for 2 hours                   $1.844 = $0.922 * 2 hours
Spark SQL compute for inference           64 CRPUs (CR.4X, 8 instances) for 1 hour    $128.00 = 1 hour * 64 CRPUs * $2.00 per CRPU-hour
Total per-inference charges                                                           $204.844 = $75.00 + $1.844 + $128.00
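And the inference table, checked the same way (the $0.0025 per 1,000 records and $0.922 per hour ml.m5.4xlarge rates come from the example):

```python
inference_charge = (30_000_000 / 1_000 * 0.0025   # number of records: $75.00
                    + 0.922 * 2                   # ml.m5.4xlarge for 2 hours: $1.844
                    + 1 * 64 * 2.00)              # 64 CRPUs for 1 hour: $128.00
print(round(inference_charge, 3))                 # 204.844
```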
Lookalike modeling pricing
When you run AWS Clean Rooms ML lookalike modeling, you pay only for the AWS-authored model trainings you request and for the lookalike segments created, on a price-per-1,000-profiles basis. The model owner is billed for both training and segment generation jobs.
Note: Pricing does not vary per AWS Region.
Lookalike modeling pricing dimensions
Dimension                                          Price
Price per 1,000 profiles for training dataset      $0.04 per 1,000 profiles
Price per 1,000 profiles in a lookalike segment    $0.25 per 1,000 profiles

Lookalike modeling pricing example
You want to use AWS Clean Rooms ML lookalike modeling to train the AWS-authored model with interaction data about 50,000,000 customers. A partner you are collaborating with requests 10 lookalike segments in a week, with an average size of 2,000,000 profiles per segment.
The following table summarizes your weekly usage and charges:
Number of profiles in training dataset (weekly)    50 million profiles    $2,000 = 50M * $0.04 per 1,000 profiles
Number of profiles per segment                     2 million profiles     $500 = 2M * $0.25 per 1,000 profiles
Number of segments                                 10                     $5,000 = 10 * $500 per segment
Total weekly charges                               $7,000 = $2,000 + $5,000
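The weekly total is the training-profiles charge plus the per-segment charges; a short sketch reproducing the table above:

```python
training_profiles = 50_000_000
segments = 10
profiles_per_segment = 2_000_000

training_charge = training_profiles / 1_000 * 0.04                 # $2,000
segment_charge = segments * (profiles_per_segment / 1_000 * 0.25)  # 10 * $500 = $5,000
print(round(training_charge + segment_charge, 2))                  # 7000.0
```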
AWS Entity Resolution on AWS Clean Rooms pricing
When you use AWS Entity Resolution on AWS Clean Rooms, you are charged on a price-per-1,000-records basis. You can prepare your data and match records with your collaborators' datasets using rule-based matching, or data service provider-based matching that leverages provider datasets (such as LiveRamp).
Notes: Pricing does not vary per AWS Region, and the AWS Free Tier is not available for AWS Entity Resolution on AWS Clean Rooms. If you use data service provider-based matching, you must have a subscription in place. Pricing does not include any fees charged by third parties for the use of their services. You can use the public subscriptions listed on AWS Data Exchange (ADX), or purchase a private subscription directly with the data service provider of your choice, and then use Bring Your Own Subscription (BYOS) to ADX. To use AWS Entity Resolution outside of AWS Clean Rooms, learn more about its pricing here.
AWS Entity Resolution on AWS Clean Rooms is available in the following AWS Regions. Rule-based matching is available only in US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), and Europe (London). Data service provider-based matching is available only in US East (Ohio), US East (N. Virginia), and US West (Oregon).
Data preparation pricing
Data preparation pricing dimension
When you use rule-based matching, at least one member in a collaboration is required to prepare their data prior to matching with their partners' datasets. When you use data service provider-based matching, all collaboration members are required to prepare their datasets with provider IDs prior to matching with their partners' datasets.
Dimension                                       Price
Price per 1,000 records for data preparation    $0.10 per 1,000 records processed
Note: If you have already used AWS Entity Resolution to prepare your data prior to joining an AWS Clean Rooms collaboration, you can use that dataset for matching in the collaboration. You are not required to do data preparation again.
Data matching pricing
Data matching pricing dimension
Any member in a collaboration can pay for data matching. For rule-based matching, one collaborator is required to pay a one-time $100.00 matching fee per collaboration; this fee is assigned to the collaborator paying for data matching.
Matching technique              Dimension                                                            Price
Rule-based                      Price per 1,000 records for data matching                            $0.50 per 1,000 records matched
Data service provider-based*    Price per 1,000 records for data service provider-based matching     $0.10 per 1,000 records processed

*For data service provider-based matching, all members are required to prepare their datasets with provider IDs prior to data matching.
Rule-based matching pricing example
You want to use AWS Entity Resolution on AWS Clean Rooms with your collaborators to match records using rule-based matching. Your dataset has 1,000,000 records, and you will run this matching once across all records. You first prepare your data, and then match records with your collaborator. After running the rule-based matching workflow with AWS Entity Resolution, you get a 60% match rate (60% is an example for pricing illustration; match rates vary on a case-by-case basis). All members in the collaboration agree that you will be the payor for data preparation, data matching, and the base fee.
The following table summarizes your total usage and charges:
Number of records processed for data preparation    1,000,000    $100.00 = 1M records * $0.10 per 1,000 records
Number of records matched for data matching         600,000      $300.00 = 1M records * 60% match rate * $0.50 per 1,000 records
Base fee for data matching                          $100.00      One-time fee per collaboration
Total charges                                       $500.00 = $100.00 + $300.00 + $100.00
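The three line items above in code form (the 60% match rate is the illustrative figure from the example; actual match rates vary):

```python
records = 1_000_000
match_rate = 0.60  # illustrative only; match rates vary case by case

prep_charge = records / 1_000 * 0.10                # data preparation: $100.00
matched_records = int(records * match_rate)         # 600,000 records matched
match_charge = matched_records / 1_000 * 0.50       # data matching: $300.00
base_fee = 100.00                                   # one-time rule-based matching fee per collaboration
print(round(prep_charge + match_charge + base_fee, 2))   # 500.0
```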
Data service provider-based matching pricing example
You want to use AWS Entity Resolution on AWS Clean Rooms with your collaborators to match records using data service provider-based matching with LiveRamp (RampIDs). You and your collaborator have prepared your datasets with provider IDs. Your dataset has 1,000,000 records, and you want to match it against your collaborator's dataset of 5,000,000 records. Your collaborator's dataset size does not affect your charges, because you pay only for the records you specify for processing, which in this case is your 1,000,000 records. All members in the collaboration agree that you will be the payor; if your collaborator were the payor instead, they would likewise pay for the 1,000,000 processed records. To use LiveRamp for this matching technique, you already have the required provider license.
The following table summarizes your total charges:
Number of records processed for data matching    1,000,000    $100.00 = 1M records * $0.10 per 1,000 records
Total charges                                    $100.00 (in addition to provider subscription costs)

Notes: If you use and pay for data service provider-based matching, you must have a provider subscription in place. Pricing does not include any fees charged by third parties for the use of their services. You can use the public subscriptions listed on AWS Data Exchange (ADX), or purchase a private subscription directly with the data service provider of your choice, and then use Bring Your Own Subscription (BYOS) to ADX. All members are required to prepare their datasets with provider IDs prior to data matching.
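For provider-based matching, only the records you submit for processing are charged; a one-line check of the table above (provider subscription fees are billed separately by the provider):

```python
records_processed = 1_000_000
print(round(records_processed / 1_000 * 0.10, 2))   # 100.0, plus provider subscription fees
```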
Additional costs
AWS Clean Rooms queries data from Amazon Simple Storage Service (Amazon S3) and metadata from the AWS Glue Data Catalog. There are no additional storage charges for querying your data with AWS Clean Rooms. Each collaboration member who contributes data to a collaboration will be charged standard Amazon S3 API and retrieval fees and AWS Glue Data Catalog API fees when their datasets are used in queries.
- You are billed by S3 when your workloads read, store, and transfer data. Query results are stored in an S3 bucket of your choice and billed at standard S3 rates. For more information, see Amazon S3 pricing.
- You are billed by AWS Glue for the requests made to the AWS Glue Data Catalog. For more information, see AWS Glue pricing.