Limitations of S3

For objects larger than 100 MB, we recommend using multipart upload via the S3 API or segmented upload via the Swift API.
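A multipart upload splits a large object into parts that are uploaded separately. The sketch below is a minimal illustration of the planning step only, assuming the 100 MB threshold above and an illustrative 100 MB part size; the function name `plan_upload` is hypothetical, not part of any API.

```python
import math

# Threshold above which multipart upload is recommended (from the 100 MB
# guidance above); the 100 MB part size is an illustrative assumption.
MULTIPART_THRESHOLD = 100 * 1024 * 1024
PART_SIZE = 100 * 1024 * 1024

def plan_upload(object_size: int) -> int:
    """Return the number of parts a multipart upload would need,
    or 1 if the object fits in a single PUT."""
    if object_size <= MULTIPART_THRESHOLD:
        return 1
    return math.ceil(object_size / PART_SIZE)

print(plan_upload(50 * 1024 * 1024))    # 50 MB: single PUT -> 1
print(plan_upload(950 * 1024 * 1024))   # 950 MB -> 10 parts of 100 MB
```

In practice an S3 client library performs this split for you once the object exceeds its configured multipart threshold.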

You set storage limits and container limits yourself.

Limitations of S3 entities

Maximum number of buckets per project | 2,000 *
Maximum number of buckets with virtual-hosted addressing per project | 100 (within the 2,000)
Maximum bucket name length | 128 UTF-8 characters
Maximum access policy size | 20 KB
Maximum number of objects per bucket | unlimited

* The first 1,000 buckets are displayed in the control panel. If there are more, you can view the full list via the API.

Limitations of API requests

The limits apply to a single bucket. If your load profile exceeds these limits, we recommend distributing the load across multiple buckets, keeping each bucket at no more than 250 requests per second (RPS).

If a limit is exceeded, the server returns a 429 Too Many Requests response with the message Rate Limit exceeded.
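A common way to handle 429 responses is to retry with exponential backoff and jitter. This is a minimal sketch, assuming a hypothetical `request_fn` callable that returns a `(status_code, body)` pair; it is not a specific client library's API.

```python
import time
import random

def with_backoff(request_fn, max_attempts=5, base_delay=0.5):
    """Call request_fn, retrying while it returns 429 Too Many Requests.

    request_fn is a hypothetical callable returning (status_code, body).
    Waits base_delay * 2**attempt seconds (plus jitter) between attempts.
    """
    for attempt in range(max_attempts):
        status, body = request_fn()
        if status != 429:
            return status, body
        # Exponential backoff with a small random jitter to avoid
        # synchronized retries from many clients.
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
    return status, body
```

The jitter matters: without it, many clients throttled at the same moment would all retry at the same moment and hit the limit again.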

Request type | Limit (requests per second) | Server message
All requests | 2,000 | Too many authorized requests
GET + HEAD | 1,000 | Too many GET requests
POST | 200 | Too many POST requests
PUT | 300 | Too many PUT requests
DELETE | 300 | Too many DELETE requests
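One way to distribute load across multiple buckets, as recommended above, is to map each object key deterministically to a bucket by hashing it. This is a sketch under the assumption that you have pre-created the buckets; the function name `pick_bucket` and the bucket names are hypothetical.

```python
import hashlib

def pick_bucket(key: str, buckets: list[str]) -> str:
    """Deterministically map an object key to one of several buckets,
    so the per-bucket request rate stays within the limits above."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    # Use the first 8 bytes of the hash as an integer index.
    index = int.from_bytes(digest[:8], "big") % len(buckets)
    return buckets[index]

buckets = ["data-0", "data-1", "data-2", "data-3"]
print(pick_bucket("images/cat.png", buckets))
```

Because the mapping is deterministic, reads and writes for the same key always go to the same bucket, and a uniform key distribution spreads the request rate roughly evenly across the shards.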