AWS Pricing Calculator
Calculate your Amazon DynamoDB and architecture cost in a single estimate.
Create your custom estimate now
With provisioned capacity mode, you specify the number of data reads and writes per second that you require for your application. You can use auto scaling to automatically adjust your table’s capacity based on the specified utilization rate to ensure application performance while reducing costs. This page details how DynamoDB charges for its core and optional features. For pricing in AWS China Regions, see the AWS China Regions pricing page.
Key terms
Read capacity unit (RCU): Each API call to read data from your table is a read request. Read requests can be strongly consistent, eventually consistent, or transactional. For items up to 4 KB in size, one RCU can perform one strongly consistent read request per second. Items larger than 4 KB require additional RCUs. For items up to 4 KB in size, one RCU can perform two eventually consistent read requests per second. Transactional read requests require two RCUs to perform one read per second for items up to 4 KB. For example, a strongly consistent read of an 8 KB item would require two RCUs, an eventually consistent read of an 8 KB item would require one RCU, and a transactional read of an 8 KB item would require four RCUs. See Read Consistency for more details.
Write capacity unit (WCU): Each API call to write data to your table is a write request. For items up to 1 KB in size, one WCU can perform one standard write request per second. Items larger than 1 KB require additional WCUs. Transactional write requests require two WCUs to perform one write per second for items up to 1 KB. For example, a standard write request of a 1 KB item would require one WCU, a standard write request of a 3 KB item would require three WCUs, and a transactional write request of a 3 KB item would require six WCUs.
Replicated write capacity unit (rWCU): When using DynamoDB global tables, your data is written automatically to multiple AWS Regions of your choice. Each write occurs in the local Region as well as the replicated Regions.
Streams read request unit: Each GetRecords API call to DynamoDB Streams is a streams read request unit. Each streams read request unit can return up to 1 MB of data.
Transactional read/write requests: In DynamoDB, a transactional read or write differs from a standard read or write because it guarantees that all operations contained in a single transaction set succeed or fail as a set.
Change data capture units: DynamoDB can capture item-level changes in your DynamoDB tables and replicate them to other AWS services such as Amazon Kinesis Data Streams and AWS Glue. DynamoDB captures these changes as delegated operations, which means DynamoDB performs the replication on your behalf so that you don’t have to manage throughput capacity. DynamoDB charges one change data capture unit for each write to your table (up to 1 KB). For items larger than 1 KB, additional change data capture units are required.
DynamoDB table classes: DynamoDB offers two table classes designed to help you optimize for cost. The DynamoDB Standard table class is the default and recommended for the vast majority of workloads. The DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class is optimized for tables that store data that is accessed infrequently, where storage is the dominant cost. Each table class offers different pricing for data storage as well as read and write requests. You can select the most cost-effective table class based on your table’s storage requirements and data access patterns. Learn more about DynamoDB table classes in the DynamoDB Developer Guide.
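To make the unit arithmetic above concrete, the following Python sketch reproduces the RCU and WCU examples from these definitions. The function names are illustrative only and are not part of any DynamoDB API.

```python
import math

def read_capacity_units(item_size_kb, consistency):
    """RCUs consumed by one read request, rounded up per 4 KB of item size."""
    units = math.ceil(item_size_kb / 4)      # one unit per 4 KB (or part thereof)
    if consistency == "strong":
        return units                          # 1 RCU per 4 KB
    if consistency == "eventual":
        return units / 2                      # 0.5 RCU per 4 KB
    if consistency == "transactional":
        return units * 2                      # 2 RCUs per 4 KB
    raise ValueError(f"unknown consistency: {consistency}")

def write_capacity_units(item_size_kb, transactional=False):
    """WCUs consumed by one write request, rounded up per 1 KB of item size."""
    units = math.ceil(item_size_kb)           # one unit per 1 KB (or part thereof)
    return units * 2 if transactional else units

# The examples given in the definitions above:
assert read_capacity_units(8, "strong") == 2
assert read_capacity_units(8, "eventual") == 1
assert read_capacity_units(8, "transactional") == 4
assert write_capacity_units(1) == 1
assert write_capacity_units(3) == 3
assert write_capacity_units(3, transactional=True) == 6
```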
DynamoDB features and billing overview
| Feature | What it does | Billing unit |
| --- | --- | --- |
| Core features | | |
| Provisioned write capacity | Writes data to your table | WCU |
| Provisioned read capacity | Reads data from your table | RCU |
| Data storage | Stores data, including index values | GB-month |
| Optional features | | |
| Continuous backup | Takes continuous backups for the preceding 35 days | GB-month |
| On-demand backup | Takes snapshot backups at specified points in time | GB-month |
| Restore from backup | Restores a table to a specific snapshot or time | GB |
| Global tables | Replicates data to create a multi-Region, multi-active table | rWCU |
| Change data capture for Amazon Kinesis Data Streams | Captures item-level modifications in any DynamoDB table and replicates them to a Kinesis data stream of your choice | Change data capture unit |
| Change data capture for AWS Glue | Captures item-level data changes on a table and replicates them to AWS Glue | Change data capture unit |
| Data export to Amazon S3 | Exports DynamoDB table backups from a specific point in time to Amazon S3 | GB |
| Data import from Amazon S3 | Migrates and loads data from Amazon S3 to new DynamoDB tables | GB |
| DynamoDB Streams | Provides a time-ordered sequence of item-level changes on a table | Streams read request unit |
| Data transfer out | Transfers data to other AWS Regions | GB |
Integrations with DynamoDB billing overview
| Integration | What it does | Billing unit |
| --- | --- | --- |
| Integration with DynamoDB Accelerator (DAX), a DynamoDB-compatible caching service | Improves price performance and reduces latency from milliseconds to microseconds | Node-hour |
| Zero-ETL integration with Amazon OpenSearch Service | Enables full-text search, vector search, semantic search, geospatial search, and more without building and managing data pipelines | GB of exports |
| Zero-ETL integration with Amazon Redshift | Enables analytics on operational data without building and managing data pipelines | GB of exports |
DynamoDB pricing
Read and write requests
Provisioned capacity
When you select provisioned capacity mode, you specify the read and write capacity that you expect your application to require. You can use auto scaling to automatically adjust your table’s capacity based on the specified utilization rate to ensure application performance while reducing costs. DynamoDB charges one WCU for each write per second (up to 1 KB) and two WCUs for each transactional write per second. For reads, DynamoDB charges one RCU for each strongly consistent read per second, two RCUs for each transactional read per second, and one-half of an RCU for each eventually consistent read per second (up to 4 KB). You will be charged for the throughput capacity (reads and writes) you provision in your Amazon DynamoDB table, even if you do not fully utilize it. The price for provisioned capacity depends on your table class. The actual read and write throughput of your DynamoDB tables may vary and may be less than the throughput capacity that you provision.
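As a rough sketch of how this is billed, the snippet below computes an hourly charge from provisioned capacity. The rates are the US East (N. Virginia) DynamoDB Standard rates used in the pricing examples later on this page and are assumptions here; substitute the rates for your own Region and table class.

```python
# Hourly charge for provisioned capacity. You pay for what is provisioned,
# not for what is actually consumed.
WCU_RATE_PER_HOUR = 0.00065   # USD per WCU-hour (example rate, US East N. Virginia, Standard)
RCU_RATE_PER_HOUR = 0.00013   # USD per RCU-hour (example rate, US East N. Virginia, Standard)

def hourly_provisioned_cost(provisioned_wcu, provisioned_rcu):
    return provisioned_wcu * WCU_RATE_PER_HOUR + provisioned_rcu * RCU_RATE_PER_HOUR

print(hourly_provisioned_cost(100, 100))   # 0.078 per hour, as in the basic example below
```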
Reserved capacity
DynamoDB reserved capacity can help you save on your provisioned capacity costs by making an upfront commitment on your base level of provisioned capacity. With reserved capacity, you pay a one-time upfront fee and commit to a minimum provisioned usage level over a period of time. Reserved capacity is billed at a discounted hourly rate. Any capacity that you provision in excess of your reserved capacity is billed at undiscounted provisioned capacity rates. Reserved capacity is available for single-region, provisioned read and write capacity units (RCU and WCU) on DynamoDB tables that use the DynamoDB Standard table class. Reserved capacity is not available for tables that use the DynamoDB Standard-IA table class, or on-demand capacity.
You may purchase DynamoDB reserved capacity by submitting a request through the AWS Management Console. Reserved capacity is purchased in blocks of 100 WCUs or 100 RCUs. You cannot purchase reserved capacity for replicated WCUs (rWCUs). When you purchase reserved capacity, you must designate an AWS Region, quantity, and term. You will be charged (1) a one-time upfront fee, and (2) an hourly fee for each hour during the term based on the amount of DynamoDB reserved capacity you purchase. DynamoDB reserved capacity is also subject to all storage, data transfer, and other fees applicable under the AWS Customer Agreement or other agreement with us governing your use of our services.
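The sketch below illustrates the billing structure described above: a discounted hourly rate plus an upfront fee for capacity covered by the reservation, and undiscounted rates for any provisioned capacity in excess of it. The reserved rates and upfront fee are placeholders, not published prices.

```python
# Illustrative only: how reserved capacity blends with provisioned capacity billing.
# The reserved rate and upfront fee below are placeholders, not published prices.
def monthly_wcu_cost(provisioned_wcu,
                     reserved_blocks,                  # reserved capacity is sold in blocks of 100 WCUs
                     hours=720,
                     reserved_rate_per_wcu_hour=0.0,   # placeholder discounted hourly rate
                     amortized_upfront_per_block=0.0,  # placeholder upfront fee, spread over the month
                     standard_rate_per_wcu_hour=0.00065):
    reserved_wcu = reserved_blocks * 100
    excess_wcu = max(provisioned_wcu - reserved_wcu, 0)      # billed at the undiscounted rate
    return (reserved_wcu * reserved_rate_per_wcu_hour * hours # the full reservation is always billed
            + reserved_blocks * amortized_upfront_per_block
            + excess_wcu * standard_rate_per_wcu_hour * hours)
```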
Data storage
You do not need to provision storage: DynamoDB monitors the size of your tables continuously to determine your storage charges. DynamoDB measures the size of your billable data by adding the raw byte size of your data plus a per-item storage overhead that depends on the features you have enabled. See the DynamoDB Developer Guide to learn more. The price for data storage depends on your table class.
Backup and restore
DynamoDB offers two methods to back up your table data. Continuous backups with point-in-time recovery (PITR) provide an ongoing backup of your table for the preceding 35 days. You can restore your table to the state of any specified second in the preceding five weeks. On-demand backups create snapshots of your table to archive for extended periods to help you meet corporate and governmental regulatory requirements.
Continuous backups (PITR)
DynamoDB charges for PITR based on the size of each DynamoDB table (table data and local secondary indexes) on which it is enabled. DynamoDB monitors the size of your PITR-enabled tables continuously throughout the month to determine your backup charges and continues to bill you until you disable PITR on each table.
On-demand backup
DynamoDB charges for on-demand backups based on the storage size of the table (table data and local secondary indexes). The size of each backup is determined at the time of each backup request. The total backup storage size billed each month is the sum of all backups of DynamoDB tables. DynamoDB monitors the size of on-demand backups continuously throughout the month to determine your backup charges.
You can use DynamoDB or AWS Backup to create and manage on-demand backups. To learn more, see Using On-Demand Backup and Restore. With AWS Backup, you can centralize and automate data protection across AWS services. AWS Backup also offers advanced features such as cross-account and cross-Region on-demand backup copying, a low-cost storage tier, backup tagging, and backup encryption that is independent from its source data to help meet your business continuity requirements and optimize backup costs. Additional charges apply for cross-Region data transfer. For more information about these charges, see AWS Backup pricing.
* Cold backup storage is supported only for on-demand backups that are managed by AWS Backup. You can opt in to use AWS Backup from the AWS Management Console.
Backups transitioned to cold storage have a minimum storage duration of 90 days; backups deleted before 90 days incur a pro-rated charge equal to the storage charge for the remaining days.
Restoring a table
Restoring a table from on-demand backups or PITR is charged based on the total size of data restored (table data, local secondary indexes, and global secondary indexes) for each request.
* Restoring from cold backup storage is supported only for on-demand backups that are managed by AWS Backup. You can opt in to use AWS Backup from the AWS Management Console. Cold backup storage is not applicable for continuous backups with point-in-time recovery (PITR).
Global tables
DynamoDB charges for global tables usage based on the resources used on each replica table. Write requests for global tables are measured in replicated WCUs instead of standard WCUs. The number of replicated WCUs consumed for replication depends on the version of global tables you are using. For more information, see Best Practices and Requirements for Managing Global Tables. The pricing depends on your table class. Read requests and data storage are billed consistently with tables that are not global tables. If you add a table replica to create or extend a global table in new Regions, DynamoDB charges for a table restore in the added Regions per gigabyte of data restored. Cross-Region replication and adding replicas to tables that contain data also incur charges for data transfer out. See the "Data transfer" section on this pricing page for details.
Change data capture for Amazon Kinesis Data Streams
DynamoDB charges for change data capture for Amazon Kinesis Data Streams in change data capture units. DynamoDB charges one change data capture unit for each write (up to 1 KB). You pay only for the writes your application performs without having to manage throughput capacity on the table.
Kinesis Data Streams charges still apply when you replicate DynamoDB changes to a Kinesis data stream. For more information, see Amazon Kinesis Data Streams pricing.
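A small sketch of this calculation follows. The $0.10-per-million-units rate is the one implied by the detailed pricing example later on this page and should be treated as an assumption; check the rate for your Region.

```python
import math

# Change data capture units: one unit per captured write, rounded up per 1 KB.
# The $0.10 per 1,000,000 units rate is implied by the detailed example on this
# page and is an assumption here; check the rate for your Region.
def cdc_units(writes_per_second, item_size_kb, seconds):
    units_per_write = math.ceil(item_size_kb)     # 1 unit per 1 KB (or part thereof)
    return int(writes_per_second * seconds * units_per_write)

units = cdc_units(writes_per_second=80, item_size_kb=1, seconds=3_600 * 24 * 30)
print(units)                          # 207,360,000 units in a 30-day month
print(units / 1_000_000 * 0.10)       # about $20.74, as in the detailed example below
```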
Change data capture for AWS Glue
DynamoDB charges for change data capture for AWS Glue in change data capture units. DynamoDB charges one change data capture unit for each write (up to 1 KB). You pay only for the writes your application performs without having to manage throughput capacity on your table.
AWS Glue charges still apply when you replicate DynamoDB changes to an AWS Glue target database. For more information, see AWS Glue pricing.
Data export to Amazon S3
Use this feature to export data from your DynamoDB continuous backups (point-in-time recovery) to Amazon Simple Storage Service (Amazon S3). The supported output data formats are DynamoDB JSON and Amazon Ion. You can analyze the exported data by using AWS services such as Amazon Athena, Amazon SageMaker, and AWS Lake Formation.
You can choose between a full export and an incremental export. Full exports are charged based on the size of each DynamoDB table (table data and local secondary indexes) at the specified point in time when the backup was created. Incremental exports are charged based on the size of data processed from continuous backups to generate the incremental export output. Additional charges apply for storing exported data in Amazon S3 and for PUT requests made against your Amazon S3 bucket. For more information about these charges, see Amazon S3 pricing.
Data import from Amazon S3
Amazon DynamoDB data import provides a simple and efficient way to move data from Amazon S3 into DynamoDB tables without writing any code. You can copy tables between AWS Regions and accounts to help migrate data and build new applications, facilitate data sharing and collaboration between teams, and simplify disaster recovery and business continuity planning. The supported input data formats are CSV, DynamoDB JSON, and Amazon Ion. Data import pricing is based on the uncompressed file size in Amazon S3. See Import from S3 for more details.
Amazon S3 charges also apply for storing your source data and for GET requests made against your Amazon S3 bucket. For more information about Amazon S3 charges, see Amazon S3 pricing.
Integration with DynamoDB Accelerator (DAX)
DAX is an Amazon DynamoDB-compatible caching service. DynamoDB charges for DAX capacity by the hour and your DAX instances run with no long-term commitments. Pricing is per node-hour consumed and is dependent on the instance type you select. Each partial node-hour consumed is billed as a full hour. Pricing applies to all individual nodes in the DAX cluster. For example, if you have a three-node DAX cluster, you are billed for each of the separate nodes (three nodes in total) on an hourly basis.
There is no charge for data transfer between Amazon Elastic Compute Cloud (Amazon EC2) and DAX within the same Availability Zone. Standard Amazon EC2 data transfer charges apply when transferring data between an Amazon EC2 instance and a DAX node in different Availability Zones of the same AWS Region. However, you are charged only for the data transfer into or out of the Amazon EC2 instance. There is no DAX data transfer charge for traffic into or out of the DAX node itself.
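The node-hour arithmetic is straightforward, as the sketch below shows. The $0.04 per node-hour figure is the t2.small example rate used in the detailed pricing example on this page, not a quoted price; actual rates depend on instance type and Region.

```python
import math

# DAX billing sketch: every node in the cluster is billed per hour, and each
# partial node-hour is billed as a full hour. The $0.04 rate is the t2.small
# example rate used later on this page (an assumption, not a quote).
def dax_cluster_cost(nodes, hours_running, rate_per_node_hour=0.04):
    billable_hours = math.ceil(hours_running)     # partial hours round up to full hours
    return nodes * billable_hours * rate_per_node_hour

print(dax_cluster_cost(nodes=3, hours_running=120))   # 14.4 -> $14.40 for five days
```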
DynamoDB Streams
DynamoDB charges for reading data from DynamoDB Streams in read request units. Each GetRecords API call is billed as a streams read request unit and returns up to 1 MB of data from DynamoDB Streams. Streams read request units are distinct from read requests on your DynamoDB table. You are not charged for GetRecords API calls invoked by AWS Lambda as part of DynamoDB triggers. You also are not charged for GetRecords API calls invoked by DynamoDB global tables.
Data transfer
Data transfer in and out refers to transfer into and out of DynamoDB. DynamoDB does not charge for inbound data transfer, and it does not charge for data transferred between DynamoDB and other AWS services within the same AWS Region (in other words, $0.00 per GB). Data transferred across AWS Regions (such as between DynamoDB in the US East [N. Virginia] Region and Amazon EC2 in the EU [Ireland] Region) is charged on both sides of the transfer. As part of the AWS Free Tier, you receive 1 GB of free data transfer out each month, aggregated across all AWS services except in the AWS GovCloud (US) Region. For more information, see the AWS Free Tier. To transfer data exceeding 500 TB per month, contact us.
DynamoDB free tier
The AWS Free Tier enables you to gain free, hands-on experience with AWS services. The following DynamoDB benefits are included as part of the AWS Free Tier. Each benefit is calculated monthly on a per-Region, per-payer account basis.
- 25 WCUs and 25 RCUs of provisioned capacity for tables using the DynamoDB Standard table class
- 25 GB of data storage for tables using the DynamoDB Standard table class
- 25 rWCUs for global tables using the DynamoDB Standard table class deployed in two AWS Regions
- 2.5 million stream read requests from DynamoDB Streams
- 1 GB of data transfer out (15 GB for your first 12 months), aggregated across AWS services
DynamoDB pricing examples
Basic example
This example demonstrates how pricing is calculated for an auto scaling–enabled table with the provisioned capacity mode. Auto scaling continuously sets provisioned capacity in response to actual consumed capacity so that actual utilization stays near target utilization.
Assume that you create a new DynamoDB Standard table in the US East (N. Virginia) Region with target utilization set to the default value of 70 percent, minimum capacity units at 100 RCUs and 100 WCUs, and maximum capacity set to 400 RCUs and 400 WCUs (see Limits in DynamoDB). For simplicity, assume that each time a user interacts with your application, one write of 1 KB and one strongly consistent read of 1 KB are performed.
For the first 10 days, assume that the consumed RCUs and WCUs vary between 1 and 70. Auto scaling does not trigger any scaling activities and your bill per hour is $0.078 ($0.065 for the 100 WCUs provisioned [$0.00065 * 100] and $0.013 for the 100 RCUs [$0.00013 * 100]).
Now assume that on day 11 the consumed capacity increases to 100 RCUs and 100 WCUs. Auto scaling starts triggering scale-up activities to increase the provisioned capacity to 143 WCUs and 143 RCUs (100 consumed ÷ 143 provisioned = 69.9 percent). The per-hour bill is $0.11154 ($0.09295 for 143 WCUs and $0.01859 for 143 RCUs).
On day 21, assume the consumed capacity decreases to 80 RCUs and 80 WCUs. Auto scaling starts triggering scale-down activities to decrease provisioned capacity to 114 WCUs and 114 RCUs (80 consumed ÷ 114 provisioned = 70.2 percent). The per-hour bill is $0.08892 ($0.0741 for 114 WCUs and $0.01482 for 114 RCUs).
For the month, you will be charged $66.83 as follows:
Days 1 – 10: $18.72 ($0.078 per hour x 24 hours x 10 days)
Days 11 – 20: $26.77 ($0.11154 per hour x 24 hours x 10 days)
Days 21 – 30: $21.34 ($0.08892 per hour x 24 hours x 10 days)
The AWS Free Tier includes 25 WCUs and 25 RCUs for tables using the DynamoDB Standard table class, reducing your monthly bill by $14.04.
25 WCU x $0.00065 per hour x 24 hours x 30 days = $11.70
25 RCU x $0.00013 per hour x 24 hours x 30 days = $2.34
Data storage: Assume your table occupies 25 GB of storage at the beginning of the month and grows to 29 GB by the end of the month, averaging 27 GB based on the continuous monitoring of your table size. Since your table class is set to DynamoDB Standard, the first 25 GB of storage are included in the AWS Free Tier. The remaining 2 GB of storage are charged at $0.25 per GB, resulting in a table storage cost of $0.50 for the month.
For the month, your total bill will be $53.29, a total that includes $52.79 for read and write capacity and $0.50 for data storage.
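The following sketch reproduces this example’s arithmetic end to end, using the US East (N. Virginia) DynamoDB Standard rates quoted above.

```python
# Reproduces the basic example: three 10-day periods at different provisioned
# levels, minus the AWS Free Tier (25 WCUs and 25 RCUs), plus storage beyond the
# free 25 GB. Rates are the US East (N. Virginia) DynamoDB Standard rates above.
WCU_RATE, RCU_RATE = 0.00065, 0.00013

def hourly(wcu, rcu):
    return wcu * WCU_RATE + rcu * RCU_RATE

periods = [
    (hourly(100, 100), 10),   # days 1-10: provisioned at the 100/100 minimum
    (hourly(143, 143), 10),   # days 11-20: auto scaling scales up to 143/143
    (hourly(114, 114), 10),   # days 21-30: auto scaling scales down to 114/114
]
capacity = sum(rate * 24 * days for rate, days in periods)   # $66.83
free_tier = hourly(25, 25) * 24 * 30                         # $14.04
storage = (27 - 25) * 0.25                                   # $0.50 (27 GB average, 25 GB free)
print(round(capacity, 2), round(capacity - free_tier + storage, 2))   # 66.83 53.29
```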
Detailed example
This example demonstrates how pricing is calculated for an auto scaling–enabled table with provisioned capacity mode. Auto scaling continuously sets provisioned capacity in response to actual consumed capacity so that actual utilization stays near target utilization.
Assume you create a new table in the US East (N. Virginia) Region with target utilization set to the default value of 70 percent, minimum capacity units at 100 RCUs and 100 WCUs, and maximum capacity set to 400 RCUs and 400 WCUs (see Limits in DynamoDB). Auto scaling operates with these limits, not scaling down provisioned capacity below the minimum or scaling up provisioned capacity above the maximum. When the table is created, auto scaling starts by provisioning the minimum capacity units. For simplicity, assume that each time a user interacts with your application, 1 write of 1 KB and 1 strongly consistent read of 1 KB are performed.
In the first hour after table creation, assume that the consumed RCUs and WCUs vary between 1 and 70. The actual utilization correspondingly varies between 1 percent (1 consumed ÷ 100 provisioned) and 70 percent (70 consumed ÷ 100 provisioned), within the target utilization of 70 percent. Auto scaling does not trigger any scaling activities and your bill for the hour is $0.078 ($0.065 for the 100 WCUs provisioned [$0.00065 * 100] and $0.013 for the 100 RCUs [$0.00013 * 100]).
During the second hour, assume the consumed capacity increases to 100 RCUs and 100 WCUs, which results in an actual utilization increase to 100 percent (100 consumed ÷ 100 provisioned), well above the target utilization of 70 percent. Auto scaling starts triggering scale-up activities to increase the provisioned capacity to bring actual utilization closer to the target of 70 percent. The result is a provisioned capacity of 143 WCUs and 143 RCUs (100 consumed ÷ 143 provisioned = 69.9 percent). The bill for this second hour is $0.11154 ($0.09295 for 143 WCUs and $0.01859 for 143 RCUs).
During the third hour, assume the consumed capacity decreases to 80 RCUs and 80 WCUs, which results in an actual utilization decrease to 56 percent (80 consumed ÷ 143 provisioned), well below the target utilization of 70 percent. Auto scaling starts triggering scale-down activities to decrease provisioned capacity to bring actual utilization closer to the target of 70 percent, resulting in provisioned capacity of 114 WCUs and 114 RCUs (80 consumed ÷ 114 provisioned = 70.2 percent). The bill for this third hour is $0.08892 ($0.0741 for 114 WCUs and $0.01482 for 114 RCUs).
For simplicity, assume that your consumed capacity remains constant at 80 RCUs and 80 WCUs. Your table also remains provisioned for 114 WCUs and 114 RCUs, with a daily charge of $2.1341, broken out as:
114 WCUs x $0.00065 per hour x 24 hours = $1.7784
114 RCUs x $0.00013 per hour x 24 hours = $0.3557
For the month, you are charged $64.04:
Day 1 total: $2.14578 per day
Hour 1: $0.078 per hour
Hour 2: $0.1154 per hour
Hours 3-24: $0.08892 per hour
Days 2-30: $2.1341 per day
The AWS Free Tier includes 25 WCUs and 25 RCUs for tables using the DynamoDB Standard table class, reducing your monthly bill by $14.04:
25 WCUs x $0.00065 per hour x 24 hours x 30 days = $11.70
25 RCUs x $0.00013 per hour x 24 hours x 30 days = $2.34
Data storage: Assume your table occupies 25 GB of storage at the beginning of the month and grows to 29 GB by the end of the month, averaging 27 GB based on the continuous monitoring of your table size. Since your table class is set to DynamoDB Standard, the first 25 GB of storage are included in the AWS Free Tier. The remaining 2 GB of storage are charged at $0.25 per GB, resulting in a table storage cost of $0.50 for the month.
Backup and restore: If the sum of all your on-demand backup storage is 60 GB for a 30-day month, the monthly cost of your backups is ($0.10 x 60 GB) = $6.00/month. However, if you then delete 15 GB of your on-demand backup data 10 days into the monthly cycle, you are billed ($0.10 x 60 GB) - ($0.10 x 15 GB x 20/30) = $5.00/month.
Now assume that in addition to performing on-demand backups, you use continuous backups. The size of your table is 29 GB, resulting in a monthly cost of ($0.20 x 29 GB) = $5.80/month.
If you need to restore your 29 GB table once during the month, that restore costs ($0.15 x 29 GB) = $4.35.
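These backup and restore figures can be reproduced as follows, assuming the rates used in this example: $0.10 per GB-month for on-demand backup storage, $0.20 per GB-month for PITR, and $0.15 per GB restored.

```python
# Backup and restore charges from this example (rates are the example's assumed
# US East (N. Virginia) rates).
on_demand = 0.10 * 60 - 0.10 * 15 * (20 / 30)   # 15 GB deleted 10 days into the month -> $5.00
pitr      = 0.20 * 29                           # 29 GB table with PITR enabled -> $5.80
restore   = 0.15 * 29                           # one restore of the 29 GB table -> $4.35
print(round(on_demand, 2), round(pitr, 2), round(restore, 2))   # 5.0 5.8 4.35
```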
Change data capture for Kinesis Data Streams: Now assume you enable streaming to a Kinesis data stream to process your data changes using Amazon Kinesis services. Also assume that your write throughput is consistent with the previous example. Your application performs 80 writes of 1 KB per second. DynamoDB charges one change data capture unit for each write of 1 KB it captures to the Kinesis data stream. Over the course of a month, this results in (80 x 3,600 x 24 x 30) = 207,360,000 change data capture units. Your monthly cost will be ($0.10 x 207,360,000/1,000,000) = $20.74.
Data export to Amazon S3: Let’s say you want to export table backups to Amazon S3 for analysis. If the size of your table at the specified point in time is 29 GB, the resulting export costs are: ($0.10 x 29 GB) = $2.90.
Integration with DynamoDB Accelerator (DAX): DAX is an Amazon DynamoDB-compatible caching service. You have determined that you need to accelerate the response time of your application and decide to use the DynamoDB Accelerator (DAX) service. You review the available hardware specifications and determine that a three-node cluster of the t2.small instance type suits your needs. You enable DAX on day 26. DynamoDB charges $0.12 per hour ($0.04 x 3 nodes), totaling $14.40 for the final 5 days in the month ($0.12 x 120 hours).
Global tables: Now assume you create a disaster recovery replica table in the US West (Oregon) Region. Assume that you add the replica in the US West (Oregon) Region when your table is 25 GB in size, resulting in $3.75 ($0.15 x 25 GB) of table restore charges. Adding this replica also generates 25 GB of data transfer, as detailed under the "Data transfer" section below. Also assume that your capacity needs are consistent with the previous example. Auto scaling continues to provision 114 WCUs and 114 RCUs for your application's throughput needs, but it now must also provision rWCUs for writing to both of your replica tables. Provisioned rWCUs equal the total number of rWCUs needed for application writes in both Regions. In this scenario, you now perform 80 writes per second to both the US East (N. Virginia) Region and the US West (Oregon) Region, resulting in a minimum provisioned capacity of 160 rWCUs (80 rWCUs in N. Virginia + 80 rWCUs in Oregon = 160 rWCUs). Auto scaling provisions 229 rWCUs (160 rWCUs/70%) to maintain actual utilization at 70 percent of provisioned capacity. For more information, see Best Practices and Requirements for Managing Global Tables. Your first 25 rWCUs provisioned each hour in each Region are included in the AWS Free Tier for tables using the DynamoDB Standard table class, resulting in an hourly charge of $0.174525, or $125.66 in a 30-day month. You also store an additional 27 GB of data in your replicated table in the US West (Oregon) Region. The first 25 GB of storage are included in the AWS Free Tier in each AWS Region for tables using the DynamoDB Standard table class. The remaining 2 GB of storage are charged at $0.25 per GB, resulting in an additional table storage cost of $0.50 for the month.
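The replicated write capacity math in this paragraph can be reproduced as follows. The $0.000975 per rWCU-hour rate is the rate implied by the figures above and is an assumption here; rWCU pricing depends on your Region and global tables version.

```python
import math

# Replicated write capacity for this two-Region example. The rWCU rate is the
# one implied by the figures above (an assumption); check your Region's pricing.
RWCU_RATE = 0.000975                                   # USD per rWCU-hour (assumed)
writes_per_second, regions, target_utilization = 80, 2, 0.70

consumed_rwcu = writes_per_second * regions            # 160 rWCUs across both Regions
provisioned_rwcu = math.ceil(consumed_rwcu / target_utilization)   # 229 rWCUs at a 70% target
free_rwcu = 25 * regions                               # 25 rWCUs per Region in the free tier
hourly = (provisioned_rwcu - free_rwcu) * RWCU_RATE    # $0.174525 per hour
print(round(hourly, 6), round(hourly * 24 * 30, 2))    # 0.174525 125.66
```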
DynamoDB Streams: Now assume you enable DynamoDB Streams and build your application to perform one read request per second against the streams data. Over the course of a month, this results in 2,592,000 streams read requests, of which the first 2,500,000 read requests are included in the AWS Free Tier. You pay only for the remaining 92,000 read requests, which are billed at $0.02 per 100,000 read request units.
Data transfer: Because you are now transferring data between AWS Regions for your global tables implementation, DynamoDB charges for data transferred out of the Region, but it does not charge for inbound data transfer. Assuming a constant 80 writes per second of 1 KB each, you generate 80 KB per second in data transfer between Regions, resulting in 198 GB (80 KB per second x 2,592,000 seconds in a 30-day month) of cross-Region data transfer per month. Adding the replica in the US West (Oregon) Region generates an additional 25 GB of data transfer. If you have already used your AWS Free Tier data transfer allowance on other AWS services, you will be charged $20.07 ($0.09 x [198 GB + 25 GB]) for data transfer.
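The data transfer total works out as follows; the conversion uses 1,048,576 KB per GB, which is how the 198 GB figure above is derived.

```python
# Cross-Region data transfer for this example: 80 writes per second of 1 KB each
# replicated to a second Region, plus the one-time 25 GB from adding the replica,
# at the $0.09 per GB rate used in this example.
seconds_per_month = 2_592_000
replication_gb = 80 * 1 * seconds_per_month / (1024 ** 2)   # 207,360,000 KB, about 198 GB
charge = 0.09 * (round(replication_gb) + 25)
print(round(replication_gb), round(charge, 2))              # 198 20.07
```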
In summary, your total monthly charges for a single-Region DynamoDB table are:
- Provisioned capacity: $50.00
- Data storage: $0.50
- On-demand backup: $5.00
- Continuous (PITR) backup: $5.80
- Table restore: $4.35
- Change data capture for Kinesis Data Streams: $20.74
- Data export to Amazon S3: $2.90
- Integration with DynamoDB Accelerator (DAX), an Amazon DynamoDB-compatible caching service: $14.40
- DynamoDB Streams: $0.02
Total charges: $103.71
Your total monthly DynamoDB charges after adding the US West (Oregon) Region are:
- Provisioned read capacity: $10.68
- Data storage (N. Virginia): $0.50
- On-demand backup: $5.00
- Continuous (PITR) backup: $5.80
- Table restore (N. Virginia): $4.35
- Change data capture for Kinesis Data Streams: $20.74
- Data export to Amazon S3: $2.90
- Integration with DynamoDB Accelerator (DAX), an Amazon DynamoDB-compatible caching service: $14.40
- DynamoDB Streams: $0.02
- Global tables table restore (Oregon): $3.75
- Global tables replicated write capacity: $125.66
- Global tables data storage (Oregon): $0.50
- Data transfer: $20.07
Total charges: $214.38
Example using different table classes
In this example, we will demonstrate how you can reduce your table’s monthly charges by choosing the DynamoDB table class that best suits your table’s storage and data access patterns.
Assume you have a table in the US East (N. Virginia) Region. Your table already stores 1 TB of historical data. The data is not frequently accessed but needs to be immediately available to your users when needed. Now, assume your data storage grows to 1.4 TB by the end of the month, averaging 1.2 TB based on continuous monitoring of your table size. Your table has a steady, predictable traffic pattern, so you provision it with 160 WCUs and 160 RCUs, knowing that utilization will not exceed 70 percent of the provisioned capacity within the month.
We will start by estimating your table’s monthly charges using the DynamoDB Standard table class.
Monthly charges using DynamoDB Standard table class
Setting your table class to DynamoDB Standard, you will be billed as follows.
Data storage: Using the DynamoDB Standard table class, the first 25 GB of storage are included in the AWS Free Tier. The remaining 1.175 TB of storage are charged at $0.25 per GB, resulting in a table storage cost of $293.75 for the month.
Provisioned capacity: The AWS Free Tier includes 25 WCUs and 25 RCUs for tables using the DynamoDB Standard table class. You will be charged for:
135 WCUs x $0.00065 per hour x 24 hours x 30 days = $63.18 for the provisioned write capacity,
135 RCUs x $0.00013 per hour x 24 hours x 30 days = $12.63 for the provisioned read capacity.
In summary, your total monthly charges using DynamoDB Standard table class are:
- Provisioned capacity: $75.82
- Data storage: $293.75
Your total monthly charges using the DynamoDB Standard table class are $369.57.
Monthly charges using DynamoDB Standard-IA table class
As shown previously, when using the DynamoDB Standard table class, the storage cost is greater than 50 percent of the provisioned capacity cost. When storage is the dominant cost of a table that uses the DynamoDB Standard table class (greater than 50 percent of the provisioned capacity cost), you can optimize for cost by switching to the DynamoDB Standard-IA table class. Given the same workload, now assume that you switch the table class to DynamoDB Standard-IA at the beginning of the next month. You will be billed as follows.
Data storage: The 1.2 TB of storage are charged at $0.10 per GB, resulting in a table storage cost of $120.00 for the month.
Provisioned capacity: You will be charged for
160 WCUs x $0.00081 per hour x 24 hours x 30 days = $93.31 for the provisioned write capacity,
160 RCUs x $0.00016 per hour x 24 hours x 30 days = $18.43 for the provisioned read capacity.
In summary, your total monthly charges using the DynamoDB Standard-IA table class are:
- Provisioned capacity: $111.74
- Data storage: $120.00
Your total monthly charges using the DynamoDB Standard-IA table class are $231.74. Switching your table to DynamoDB Standard-IA reduces your table’s total monthly charges by 37.3%, or $137.83.
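The comparison can be reproduced with the sketch below, using the rates stated in this example; note that the AWS Free Tier applies only to the DynamoDB Standard table class.

```python
# Monthly cost of the same workload under each table class, using the
# US East (N. Virginia) rates stated in this example.
HOURS = 24 * 30
avg_storage_gb = 1200                      # 1.2 TB average for the month

standard = ((avg_storage_gb - 25) * 0.25   # storage beyond the 25 GB free tier
            + 135 * 0.00065 * HOURS        # 160 WCUs minus the 25 free
            + 135 * 0.00013 * HOURS)       # 160 RCUs minus the 25 free
standard_ia = (avg_storage_gb * 0.10       # Standard-IA storage (no free tier)
               + 160 * 0.00081 * HOURS     # Standard-IA write capacity
               + 160 * 0.00016 * HOURS)    # Standard-IA read capacity
print(round(standard, 2), round(standard_ia, 2))   # 369.57 231.74 -> a saving of about $137.83
```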
Additional pricing resources
Easily calculate your monthly costs with AWS
Contact AWS specialists to get a personalized quote