diff --git a/docs/backends/amazon-S3.rst b/docs/backends/amazon-S3.rst
index ee52e976..9a5792b8 100644
--- a/docs/backends/amazon-S3.rst
+++ b/docs/backends/amazon-S3.rst
@@ -11,201 +11,235 @@ we always recommend the most recent. Either add it to your requirements or use t
 
     pip install django-storages[s3]
 
-Settings
---------
-
-To upload your media files to S3 set::
+Configuration & Settings
+------------------------
 
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
+Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently configure
+storage backends and add additional ones. Configuring multiple storage objects before Django 4.2 required subclassing the
+backend because the settings were global; now you pass them under the key ``OPTIONS``. For example, to save media files to
+S3 on Django >= 4.2 you'd define::
 
-    # django >= 4.2
-    STORAGES = {"default": {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"}}
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
+            "OPTIONS": {
+                ...your_options_here
+            },
+        },
+    }
 
-To allow ``django-admin collectstatic`` to automatically put your static files in your bucket set the following in your settings.py::
+On Django < 4.2 you'd instead define::
 
-    # django < 4.2
-    STATICFILES_STORAGE = 'storages.backends.s3boto3.S3StaticStorage'
+    DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
 
-    # django >= 4.2
-    STORAGES = {"staticfiles": {"BACKEND": "storages.backends.s3boto3.S3StaticStorage"}}
+To put static files on S3 via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the same level as
+``default`` above) inside of the ``STORAGES`` dictionary, while on Django < 4.2 you'd instead define::
 
-If you want to use something like `ManifestStaticFilesStorage`_ then you must instead use::
+    STATICFILES_STORAGE = "storages.backends.s3boto3.S3StaticStorage"
 
-    # django < 4.2
-    STATICFILES_STORAGE = 'storages.backends.s3boto3.S3ManifestStaticStorage'
+The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
+well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.
 
-    # django >= 4.2
-    STORAGES = {"staticfiles": {"BACKEND": "storages.backends.s3boto3.S3ManifestStaticStorage"}}
+Authentication Settings
+~~~~~~~~~~~~~~~~~~~~~~~
 
 There are several different methods for specifying the AWS credentials used to create the S3 client.
 In the order that ``S3Boto3Storage`` searches for them:
 
-#. ``AWS_S3_SESSION_PROFILE``
-#. ``AWS_S3_ACCESS_KEY_ID`` and ``AWS_S3_SECRET_ACCESS_KEY``
-#. ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY``
+#. ``session_profile`` or ``AWS_S3_SESSION_PROFILE``
+#. ``access_key`` or ``AWS_S3_ACCESS_KEY_ID`` or ``AWS_ACCESS_KEY_ID``
+#. ``secret_key`` or ``AWS_S3_SECRET_ACCESS_KEY`` or ``AWS_SECRET_ACCESS_KEY``
 #. The environment variables AWS_S3_ACCESS_KEY_ID and AWS_S3_SECRET_ACCESS_KEY
 #. The environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
 #. Use Boto3's default session
 
-``AWS_S3_SESSION_PROFILE``
-  The AWS profile to use instead of ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY``. All configuration information
-  other than the key id and secret key is ignored in favor of the other settings specified below.
+Settings
+~~~~~~~~
+
+``bucket_name`` or ``AWS_STORAGE_BUCKET_NAME``
 
-.. note::
-   If this is set, then it is a configuration error to also set ``AWS_S3_ACCESS_KEY_ID`` and ``AWS_S3_SECRET_ACCESS_KEY``.
-   ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY`` are ignored
+  **Required**
 
-``AWS_S3_ACCESS_KEY_ID or AWS_ACCESS_KEY_ID``
-  Your Amazon Web Services access key, as a string.
+  The name of the S3 bucket that will host the files.
 
-``AWS_S3_SECRET_ACCESS_KEY or AWS_SECRET_ACCESS_KEY``
-  Your Amazon Web Services secret access key, as a string.
+``object_parameters`` or ``AWS_S3_OBJECT_PARAMETERS``
 
-``AWS_STORAGE_BUCKET_NAME``
-  Your Amazon Web Services storage bucket name, as a string.
+  Default: ``{}``
 
-``AWS_S3_OBJECT_PARAMETERS`` (optional, default ``{}``)
 
 Use this to set parameters on all objects. To set these on a per-object basis, subclass the backend and override
 ``S3Boto3Storage.get_object_parameters``. To view a full list of possible parameters (there are many) see the
 `Boto3 docs for uploading files`_; an incomplete list includes: ``CacheControl``, ``SSEKMSKeyId``, ``StorageClass``,
 ``Tagging`` and ``Metadata``.
 
-``AWS_DEFAULT_ACL`` (optional; default is ``None`` which means the file will be ``private`` per Amazon's default)
+``default_acl`` or ``AWS_DEFAULT_ACL``
 
-  Use this to set an ACL on your file such as ``public-read``. If not set the file will be ``private`` per Amazon's default.
-  If the ``ACL`` parameter is set in ``AWS_S3_OBJECT_PARAMETERS``, then this setting is ignored.
+  Default: ``None`` - the file will be ``private`` per Amazon's default
 
-  Options such as ``public-read`` and ``private`` come from the `list of canned ACLs`_.
+  Use this to set an ACL on your file such as ``public-read``. If not set the file will be ``private`` per Amazon's default.
+  If the ``ACL`` parameter is set in ``object_parameters``, then this setting is ignored.
 
-``AWS_QUERYSTRING_AUTH`` (optional; default is ``True``)
-  Setting ``AWS_QUERYSTRING_AUTH`` to ``False`` to remove query parameter
-  authentication from generated URLs. This can be useful if your S3 buckets
-  are public.
+  Options such as ``public-read`` and ``private`` come from the `list of canned ACLs`_.
 
-``AWS_S3_MAX_MEMORY_SIZE`` (optional; default is ``0`` - do not roll over)
-  The maximum amount of memory (in bytes) a file can take up before being rolled over
-  into a temporary file on disk.
+``querystring_auth`` or ``AWS_QUERYSTRING_AUTH``
 
-``AWS_QUERYSTRING_EXPIRE`` (optional; default is 3600 seconds)
-  The number of seconds that a generated URL is valid for.
+  Default: ``True``
 
-``AWS_S3_URL_PROTOCOL`` (optional: default is ``https:``)
-  The protocol to use when constructing a custom domain, ``AWS_S3_CUSTOM_DOMAIN`` must be ``True`` for this to have any effect.
+  Set this to ``False`` to remove query parameter
+  authentication from generated URLs. This can be useful if your S3 buckets
+  are public.
 
-``AWS_S3_FILE_OVERWRITE`` (optional: default is ``True``)
-  By default files with the same name will overwrite each other. Set this to ``False`` to have extra characters appended.
+``max_memory_size`` or ``AWS_S3_MAX_MEMORY_SIZE``
 
-``AWS_LOCATION`` (optional: default is `''`)
-  A path prefix that will be prepended to all uploads
+  Default: ``0`` (i.e. do not roll over)
 
-``AWS_IS_GZIPPED`` (optional: default is ``False``)
-  Whether or not to enable gzipping of content types specified by ``GZIP_CONTENT_TYPES``
+  The maximum amount of memory (in bytes) a file can take up before being rolled over
+  into a temporary file on disk.
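+
+For reference, here is a minimal sketch that combines several of the settings documented above under
+``OPTIONS`` on Django >= 4.2. The bucket name and parameter values are illustrative placeholders,
+not defaults::
+
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
+            "OPTIONS": {
+                "bucket_name": "my-app-bucket",
+                "default_acl": "private",
+                "querystring_auth": True,
+                "object_parameters": {"CacheControl": "max-age=86400"},
+            },
+        },
+    }
+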
-``GZIP_CONTENT_TYPES`` (optional: default is ``text/css``, ``text/javascript``, ``application/javascript``, ``application/x-javascript``, ``image/svg+xml``)
-  When ``AWS_IS_GZIPPED`` is set to ``True`` the content types which will be gzipped
+``querystring_expire`` or ``AWS_QUERYSTRING_EXPIRE``
 
-``AWS_S3_REGION_NAME`` (optional: default is ``None``)
-  Name of the AWS S3 region to use (eg. eu-west-1)
+  Default: ``3600``
 
-``AWS_S3_USE_SSL`` (optional: default is ``True``)
-  Whether or not to use SSL when connecting to S3, this is passed to the boto3 session resource constructor.
+  The number of seconds that a generated URL is valid for.
 
-``AWS_S3_VERIFY`` (optional: default is ``None``)
-  Whether or not to verify the connection to S3. Can be set to False to not verify certificates or a path to a CA cert bundle.
+``url_protocol`` or ``AWS_S3_URL_PROTOCOL``
 
-``AWS_S3_ENDPOINT_URL`` (optional: default is ``None``)
-  Custom S3 URL to use when connecting to S3, including scheme. Overrides ``AWS_S3_REGION_NAME`` and ``AWS_S3_USE_SSL``. To avoid ``AuthorizationQueryParametersError`` error, ``AWS_S3_REGION_NAME`` should also be set.
+  Default: ``https:``
 
-``AWS_S3_ADDRESSING_STYLE`` (optional: default is ``None``)
-  Possible values ``virtual`` and ``path``.
+  The protocol to use when constructing a custom domain; ``custom_domain`` must be set for this to have any effect.
 
-``AWS_S3_PROXIES`` (optional: default is ``None``)
-  A dictionary of proxy servers to use by protocol or endpoint, e.g.:
-  {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}.
+``file_overwrite`` or ``AWS_S3_FILE_OVERWRITE``
 
-``AWS_S3_SIGNATURE_VERSION`` (optional)
+  Default: ``True``
 
-  As of ``boto3`` version 1.13.21 the default signature version used for generating presigned
-  urls is still ``v2``. To be able to access your s3 objects in all regions through presigned
-  urls, explicitly set this to ``s3v4``.
+  By default files with the same name will overwrite each other. Set this to ``False`` to have extra characters appended.
 
-  Set this to use an alternate version such as ``s3``. Note that only certain regions
-  support the legacy ``s3`` (also known as ``v2``) version. You can check to see
-  if your region is one of them in the `S3 region list`_.
+``location`` or ``AWS_LOCATION``
+
+  Default: ``''``
+
+  A path prefix that will be prepended to all uploads.
+
+``gzip`` or ``AWS_IS_GZIPPED``
+
+  Default: ``False``
+
+  Whether or not to enable gzipping of content types specified by ``gzip_content_types``.
+
+``gzip_content_types`` or ``GZIP_CONTENT_TYPES``
+
+  Default: ``(text/css,text/javascript,application/javascript,application/x-javascript,image/svg+xml)``
+
+  The list of content types to be gzipped when ``gzip`` is ``True``.
+
+``region_name`` or ``AWS_S3_REGION_NAME``
+
+  Default: ``None``
+
+  Name of the AWS S3 region to use (e.g. ``eu-west-1``)
+
+``use_ssl`` or ``AWS_S3_USE_SSL``
+
+  Default: ``True``
+
+  Whether or not to use SSL when connecting to S3; this is passed to the boto3 session resource constructor.
+
+``verify`` or ``AWS_S3_VERIFY``
+
+  Default: ``None``
+
+  Whether or not to verify the connection to S3. Can be set to ``False`` to not verify certificates or a path to a CA cert bundle.
+
+``endpoint_url`` or ``AWS_S3_ENDPOINT_URL``
 
-``AWS_S3_TRANSFER_CONFIG`` (optional, default is ``None``)
+  Default: ``None``
+
+  Custom S3 URL to use when connecting to S3, including scheme. Overrides ``region_name`` and ``use_ssl``.
+  To avoid ``AuthorizationQueryParametersError`` errors, ``region_name`` should also be set.
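+
+  For example, ``endpoint_url`` is what lets you point the backend at a self-hosted,
+  S3-compatible server. A minimal sketch, assuming a hypothetical server at
+  ``https://minio.example.com:9000`` (the URL, bucket, and region are placeholders)::
+
+      STORAGES = {
+          "default": {
+              "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
+              "OPTIONS": {
+                  "bucket_name": "my-bucket",
+                  "endpoint_url": "https://minio.example.com:9000",
+                  # Also set region_name to avoid AuthorizationQueryParametersError.
+                  "region_name": "us-east-1",
+              },
+          },
+      }
+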
+
+``addressing_style`` or ``AWS_S3_ADDRESSING_STYLE``
+
+  Default: ``None``
+
+  Possible values ``virtual`` and ``path``.
+
+``proxies`` or ``AWS_S3_PROXIES``
+
+  Default: ``None``
+
+  Dictionary of proxy servers to use by protocol or endpoint, e.g.::
+
+    {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}
+
+``transfer_config`` or ``AWS_S3_TRANSFER_CONFIG``
+
+  Default: ``None``
 
 Set this to customize the transfer config options such as disabling threads for ``gevent``
 compatibility; see the `Boto3 docs for TransferConfig`_ for more info.
 
-.. note::
-    The signature versions are not backwards compatible so be careful about url endpoints if making this change
-    for legacy projects.
+``custom_domain`` or ``AWS_S3_CUSTOM_DOMAIN``
 
-.. _AWS Signature Version 4: https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-query-string-auth.html
-.. _S3 region list: http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
-.. _list of canned ACLs: https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
-.. _Boto3 docs for uploading files: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.put_object
-.. _Boto3 docs for TransferConfig: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/customizations/s3.html#boto3.s3.transfer.TransferConfig
-.. _ManifestStaticFilesStorage: https://docs.djangoproject.com/en/3.1/ref/contrib/staticfiles/#manifeststaticfilesstorage
+  Default: ``None``
 
-CloudFront
-----------
+  Set this to specify a custom domain for constructed URLs.
 
-If you're using S3 as a CDN (via CloudFront), you'll probably want this storage
-to serve those files using that::
+  .. note::
+     You'll have to configure CloudFront to use the bucket as an origin for this to work.
 
-    AWS_S3_CUSTOM_DOMAIN = 'cdn.mydomain.com'
+  .. warning::
 
-.. warning::
+    Django's ``STATIC_URL`` must end in a slash and this must not. It is best to set this variable independently of ``STATIC_URL``.
 
-    Django's ``STATIC_URL`` `must end in a slash`_ and the ``AWS_S3_CUSTOM_DOMAIN`` *must not*. It is best to set this variable independently of ``STATIC_URL``.
+``signature_version`` or ``AWS_S3_SIGNATURE_VERSION``
 
-.. _must end in a slash: https://docs.djangoproject.com/en/dev/ref/settings/#static-url
+  Default: ``None``
 
-Keep in mind you'll have to configure CloudFront to use the proper bucket as an
-origin manually for this to work.
+  As of ``boto3`` version 1.13.21 the default signature version used for generating presigned
+  urls is still ``v2``. To be able to access your s3 objects in all regions through presigned
+  urls, explicitly set this to ``s3v4``.
 
-If you need to use multiple storages that are served via CloudFront, pass the
-`custom_domain` parameter to their constructors.
+  Set this to use an alternate version such as ``s3``. Note that only certain regions
+  support the legacy ``s3`` (also known as ``v2``) version. You can check to see
+  if your region is one of them in the `S3 region list`_.
 
-CloudFront Signed Urls
-^^^^^^^^^^^^^^^^^^^^^^
-If you want django-storages to generate Signed Cloudfront Urls, you can do so by following these steps:
+  .. warning::
 
-- modify `settings.py` to include::
-
-    AWS_CLOUDFRONT_KEY = os.environ.get('AWS_CLOUDFRONT_KEY', None).encode('ascii')
-    AWS_CLOUDFRONT_KEY_ID = os.environ.get('AWS_CLOUDFRONT_KEY_ID', None)
+    The signature versions are not backwards compatible so be careful about url endpoints if making this change
+    for legacy projects.
 
+.. _AWS Signature Version 4: https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-query-string-auth.html
+.. _S3 region list: http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
+.. _list of canned ACLs: https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl
+.. _Boto3 docs for uploading files: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.put_object
+.. _Boto3 docs for TransferConfig: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/customizations/s3.html#boto3.s3.transfer.TransferConfig
+.. _ManifestStaticFilesStorage: https://docs.djangoproject.com/en/3.1/ref/contrib/staticfiles/#manifeststaticfilesstorage
 
-- Generate a CloudFront Key Pair as specified in the `AWS Doc to create CloudFront key pairs`_.
+.. _cloudfront-signed-url-header:
 
-- Updated ENV vars with the corresponding values::
+CloudFront Signed URLs
+----------------------
 
-    AWS_CLOUDFRONT_KEY=-----BEGIN RSA PRIVATE KEY-----
-    ...
-    -----END RSA PRIVATE KEY-----
-    AWS_CLOUDFRONT_KEY_ID=APK....
+If you want to generate signed CloudFront URLs, you can do so by following these steps:
 
-.. _AWS Doc to create CloudFront key pairs: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-trusted-signers.html#private-content-creating-cloudfront-key-pairs-procedure
+#. Generate a CloudFront Key Pair as specified in the `AWS docs`_.
+#. Install one of `cryptography`_ or `rsa`_.
+#. Set both ``cloudfront_key_id`` (or ``AWS_CLOUDFRONT_KEY_ID``) and ``cloudfront_key`` (or ``AWS_CLOUDFRONT_KEY``) to the generated values.
 
-django-storages will now generate `signed cloudfront urls`_
+django-storages will now generate `signed cloudfront urls`_.
 
+.. _AWS docs: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-trusted-signers.html#private-content-creating-cloudfront-key-pairs-procedure
 .. _signed cloudfront urls: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-signed-urls.html
 
-.. note::
-    You must install one of `cryptography`_ or `rsa`_ to use signed URLs.
-
 .. _cryptography: https://pypi.org/project/cryptography/
 .. _rsa: https://pypi.org/project/rsa/
 
 IAM Policy
 ----------
 
-The IAM policy permissions needed for most common use cases are:
+The IAM policy definition needed for the most common use case is:
 
 .. code-block:: json
 
@@ -238,197 +272,3 @@ The IAM policy permissions needed for most common use cases are:
 For more information about Principal, please refer to `AWS JSON Policy Elements`_
 
 .. _AWS JSON Policy Elements: https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_principal.html
-
-Storage
--------
-
-Standard file access options are available, and work as expected::
-
-    >>> from django.core.files.storage import default_storage
-    >>> default_storage.exists('storage_test')
-    False
-    >>> file = default_storage.open('storage_test', 'w')
-    >>> file.write('storage contents')
-    >>> file.close()
-
-    >>> default_storage.exists('storage_test')
-    True
-    >>> file = default_storage.open('storage_test', 'r')
-    >>> file.read()
-    'storage contents'
-    >>> file.close()
-
-    >>> default_storage.delete('storage_test')
-    >>> default_storage.exists('storage_test')
-    False
-
-
-Overriding the default Storage class
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-You can override the default Storage class and create your custom storage backend. Below provides some examples and common use cases to help you get started. This section assumes you have your AWS credentials configured, e.g. ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY``.
-
-To create a storage class using a specific bucket::
-
-    from storages.backends.s3boto3 import S3Boto3Storage
-
-    class MediaStorage(S3Boto3Storage):
-        bucket_name = 'my-media-bucket'
-
-
-Assume that you store the above class ``MediaStorage`` in a file called ``custom_storage.py`` in the project directory tree like below::
-
-    | (your django project root directory)
-    | ├── manage.py
-    | ├── my_django_app
-    | │   ├── custom_storage.py
-    | │   └── ...
-    | ├── ...
-
-You can now use your custom storage class for default file storage in Django settings like below::
-
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = 'my_django_app.custom_storage.MediaStorage'
-
-    # django >= 4.2
-    STORAGES = {"default": {"BACKEND": "my_django_app.custom_storage.MediaStorage"}}
-
-Or you may want to upload files to the bucket in some view that accepts file upload request::
-
-    import os
-
-    from django.views import View
-    from django.http import JsonResponse
-
-    from django_backend.custom_storages import MediaStorage
-
-    class FileUploadView(View):
-        def post(self, requests, **kwargs):
-            file_obj = requests.FILES.get('file', '')
-
-            # do your validation here e.g. file size/type check
-
-            # organize a path for the file in bucket
-            file_directory_within_bucket = 'user_upload_files/{username}'.format(username=requests.user)
-
-            # synthesize a full file path; note that we included the filename
-            file_path_within_bucket = os.path.join(
-                file_directory_within_bucket,
-                file_obj.name
-            )
-
-            media_storage = MediaStorage()
-
-            if not media_storage.exists(file_path_within_bucket): # avoid overwriting existing file
-                media_storage.save(file_path_within_bucket, file_obj)
-                file_url = media_storage.url(file_path_within_bucket)
-
-                return JsonResponse({
-                    'message': 'OK',
-                    'fileUrl': file_url,
-                })
-            else:
-                return JsonResponse({
-                    'message': 'Error: file {filename} already exists at {file_directory} in bucket {bucket_name}'.format(
-                        filename=file_obj.name,
-                        file_directory=file_directory_within_bucket,
-                        bucket_name=media_storage.bucket_name
-                    ),
-                }, status=400)
-
-A side note is that if you have ``AWS_S3_CUSTOM_DOMAIN`` setup in your ``settings.py``, by default the storage class will always use ``AWS_S3_CUSTOM_DOMAIN`` to generate url.
-
-If your ``AWS_S3_CUSTOM_DOMAIN`` is pointing to a different bucket than your custom storage class, the ``.url()`` function will give you the wrong url. In such case, you will have to configure your storage class and explicitly specify ``custom_domain`` as below::
-
-    class MediaStorage(S3Boto3Storage):
-        bucket_name = 'my-media-bucket'
-        custom_domain = '{}.s3.amazonaws.com'.format(bucket_name)
-
-You can also decide to config your custom storage class to store files under a specific directory within the bucket::
-
-    class MediaStorage(S3Boto3Storage):
-        bucket_name = 'my-app-bucket'
-        location = 'media' # store files under directory `media/` in bucket `my-app-bucket`
-
-This is especially useful when you want to have multiple storage classes share the same bucket::
-
-    class MediaStorage(S3Boto3Storage):
-        bucket_name = 'my-app-bucket'
-        location = 'media'
-
-    class StaticStorage(S3Boto3Storage):
-        bucket_name = 'my-app-bucket'
-        location = 'static'
-
-So your bucket file can be organized like as below::
-
-    | my-app-bucket
-    | ├── media
-    | │   ├── user_video.mp4
-    | │   ├── user_file.pdf
-    | │   └── ...
-    | ├── static
-    | │   ├── app.js
-    | │   ├── app.css
-    | │   └── ...
-
-
-Model
------
-
-An object without a file has limited functionality::
-
-    from django.db import models
-    from django.core.files.base import ContentFile
-
-    class MyModel(models.Model):
-        normal = models.FileField()
-
-    >>> obj1 = MyModel()
-    >>> obj1.normal
-
-    >>> obj1.normal.size
-    Traceback (most recent call last):
-    ...
-    ValueError: The 'normal' attribute has no file associated with it.
-
-Saving a file enables full functionality::
-
-    >>> obj1.normal.save('django_test.txt', ContentFile(b'content'))
-    >>> obj1.normal
-
-    >>> obj1.normal.size
-    7
-    >>> obj1.normal.read()
-    'content'
-
-Files can be read in a little at a time, if necessary::
-
-    >>> obj1.normal.open()
-    >>> obj1.normal.read(3)
-    'con'
-    >>> obj1.normal.read()
-    'tent'
-    >>> '-'.join(obj1.normal.chunks(chunk_size=2))
-    'co-nt-en-t'
-
-Save another file with the same name::
-
-    >>> obj2 = MyModel()
-    >>> obj2.normal.save('django_test.txt', ContentFile(b'more content'))
-    >>> obj2.normal
-
-    >>> obj2.normal.size
-    12
-
-Push the objects into the cache to make sure they pickle properly::
-
-    >>> cache.set('obj1', obj1)
-    >>> cache.set('obj2', obj2)
-    >>> cache.get('obj2').normal
-
-Clean up the temporary files::
-
-    >>> obj1.normal.delete()
-    >>> obj2.normal.delete()
diff --git a/docs/backends/azure.rst b/docs/backends/azure.rst
index 1acc5398..3445ce42 100644
--- a/docs/backends/azure.rst
+++ b/docs/backends/azure.rst
@@ -4,188 +4,172 @@ A custom storage system for Django using Microsoft Azure Storage backend.
 
-Notes
-*****
+Installation
+------------
 
-Be aware Azure file names have some extra restrictions. They can't:
+Install Azure SDK::
 
-  - end with a dot (``.``) or slash (``/``)
-  - contain more than 256 slashes (``/``)
-  - be longer than 1024 characters
+    pip install django-storages[azure]
 
-This is usually not an issue, since some file-systems won't
-allow this anyway.
-There's ``default_storage.get_name_max_len()`` method
-to get the ``max_length`` allowed. This is useful
-for form inputs. It usually returns
-``1024 - len(azure_location_setting)``.
-There's ``default_storage.get_valid_name(...)`` method
-to clean up file names when migrating to Azure.
+Configuration & Settings
+------------------------
 
-Gzipping for static files must be done through Azure CDN.
+Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently configure
+storage backends and add additional ones. Configuring multiple storage objects before Django 4.2 required subclassing the
+backend because the settings were global; now you pass them under the key ``OPTIONS``. For example, to save media files to
+Azure on Django >= 4.2 you'd define::
 
-Install
-*******
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.azure_storage.AzureStorage",
+            "OPTIONS": {
+                ...your_options_here
+            },
+        },
+    }
 
-Install Azure SDK::
+On Django < 4.2 you'd instead define::
 
-    pip install django-storages[azure]
+    DEFAULT_FILE_STORAGE = "storages.backends.azure_storage.AzureStorage"
 
+To put static files on Azure via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the same level as
+``default`` above) inside of the ``STORAGES`` dictionary, while on Django < 4.2 you'd instead define::
 
-Private VS Public Access
-************************
+    STATICFILES_STORAGE = "storages.backends.azure_storage.AzureStorage"
 
-The ``AzureStorage`` allows a single container. The container may have either
-public access or private access. When dealing with a private container, the
-``AZURE_URL_EXPIRATION_SECS`` must be set to get temporary URLs.
+The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
+well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.
 
-A common setup is having private media files and public static files,
-since public files allow for better caching (i.e: no query-string within the URL).
+Authentication Settings
+~~~~~~~~~~~~~~~~~~~~~~~
 
-One way to support this is having two backends, a regular ``AzureStorage``
-with the private container and expiration setting set, and a custom
-backend (i.e: a subclass of ``AzureStorage``) for the public container.
+Several different methods of authentication are provided. In order of precedence they are:
 
-Custom backend::
+#. ``connection_string`` or ``AZURE_CONNECTION_STRING`` (see `Connection string docs <https://docs.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string>`_)
+#. (``account_key`` or ``AZURE_ACCOUNT_KEY``) and (``account_name`` or ``AZURE_ACCOUNT_NAME``)
+#. ``token_credential`` or ``AZURE_TOKEN_CREDENTIAL``
+#. ``sas_token`` or ``AZURE_SAS_TOKEN``
 
-    # file: ./custom_storage/custom_azure.py
-    class PublicAzureStorage(AzureStorage):
-        account_name = 'myaccount'
-        account_key = 'mykey'
-        azure_container = 'mypublic_container'
-        expiration_secs = None
+Settings
+~~~~~~~~
 
-Then on settings set::
+``azure_container`` or ``AZURE_CONTAINER``
 
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'
-    STATICFILES_STORAGE = 'custom_storage.custom_azure.PublicAzureStorage'
+  **Required**
 
-    # django >= 4.2
-    STORAGES = {
-        "default": {"BACKEND": "storages.backends.azure_storage.AzureStorage"},
-        "staticfiles": {"BACKEND": "custom_storage.custom_azure.PublicAzureStorage"},
-    }
+  This is the container where files uploaded through Django will be stored.
+  The container must be already created, since the storage system will not attempt to create it.
 
-+++++++++++++++++++++
-Private VS Public URL
-+++++++++++++++++++++
+``azure_ssl`` or ``AZURE_SSL``
 
-The difference between public and private URLs is that private includes the SAS token.
-With private URLs you can override certain properties stored for the blob by specifying
-query parameters as part of the shared access signature. These properties include the
-cache-control, content-type, content-encoding, content-language, and content-disposition.
-See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-properties#remarks
+  Default: ``True``
 
-You can specify these parameters by::
+  Set a secure connection (HTTPS); otherwise it makes an insecure connection (HTTP).
 
-    az_storage = AzureStorage()
-    az_url = az_storage.url(blob_name, parameters={'content_type': 'text/html;'})
+``upload_max_conn`` or ``AZURE_UPLOAD_MAX_CONN``
 
+  Default: ``2``
 
-Settings
-********
+  Number of connections to make when uploading a single file.
 
-The following settings should be set within the standard Django
-configuration file, usually `settings.py`.
+``timeout`` or ``AZURE_CONNECTION_TIMEOUT_SECS``
 
-Set the default storage (i.e: for media files) and the static storage
-(i.e: for static files) to use the Azure backend::
+  Default: ``20``
 
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'
-    STATICFILES_STORAGE = 'storages.backends.azure_storage.AzureStorage'
+  Global connection timeout in seconds.
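+
+Pulling the authentication settings together, a minimal Django >= 4.2 configuration using the
+``account_name``/``account_key`` pair documented above might look like this sketch (all values are
+placeholders)::
+
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.azure_storage.AzureStorage",
+            "OPTIONS": {
+                "account_name": "mystorageaccount",
+                "account_key": "...",
+                "azure_container": "media",
+            },
+        },
+    }
+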
-    # django >= 4.2
-    STORAGES = {
-        "default": {"BACKEND": "storages.backends.azure_storage.AzureStorage"},
-        "staticfiles": {"BACKEND": "storages.backends.azure_storage.AzureStorage"},
-    }
+``max_memory_size`` or ``AZURE_BLOB_MAX_MEMORY_SIZE``
 
-The following settings are available:
+  Default: ``2*1024*1024`` (i.e. 2 MB)
 
-``AZURE_ACCOUNT_NAME``
+  Maximum amount of memory (in bytes) used by a downloaded file before dumping it to disk.
 
-    This setting is the Windows Azure Storage Account name, which in many cases
-    is also the first part of the url for instance: http://azure_account_name.blob.core.windows.net/
-    would mean::
+``expiration_secs`` or ``AZURE_URL_EXPIRATION_SECS``
 
-        AZURE_ACCOUNT_NAME = "azure_account_name"
+  Default: ``None``
 
-``AZURE_ACCOUNT_KEY``
+  Seconds before a URL expires; set to ``None`` to never expire it.
+  Be aware the container must have public read permissions in order
+  to access a URL without expiration date.
 
-    This is the private key that gives Django access to the Windows Azure Account.
+``overwrite_files`` or ``AZURE_OVERWRITE_FILES``
 
-``AZURE_CONTAINER``
+  Default: ``False``
 
-    This is where the files uploaded through Django will be uploaded.
-    The container must be already created, since the storage system will not attempt to create it.
+  Whether or not to overwrite a file previously uploaded with the same name. If not, random characters are appended.
 
-``AZURE_SSL``
+``location`` or ``AZURE_LOCATION``
 
-    Set a secure connection (HTTPS), otherwise it makes an insecure connection (HTTP). Default is ``True``
+  Default: ``''``
 
-``AZURE_UPLOAD_MAX_CONN``
+  Default location for the uploaded files. This is a path that gets prepended to every file name.
 
-    Number of connections to make when uploading a single file. Default is ``2``
+``endpoint_suffix`` or ``AZURE_ENDPOINT_SUFFIX``
 
-``AZURE_CONNECTION_TIMEOUT_SECS``
+  Default: ``core.windows.net``
 
-    Global connection timeout in seconds. Default is ``20``
+  Use ``core.chinacloudapi.cn`` for azure.cn accounts.
 
-``AZURE_BLOB_MAX_MEMORY_SIZE``
+``custom_domain`` or ``AZURE_CUSTOM_DOMAIN``
 
-    Maximum memory used by a downloaded file before dumping it to disk. Unit is in bytes. Default is ``2MB``
+  Default: ``None``
 
-``AZURE_URL_EXPIRATION_SECS``
+  The custom domain to use for generating URLs for files. For
+  example, ``www.mydomain.com`` or ``mycdn.azureedge.net``.
 
-    Seconds before a URL expires, set to ``None`` to never expire it.
-    Be aware the container must have public read permissions in order
-    to access a URL without expiration date. Default is ``None``
+``token_credential`` or ``AZURE_TOKEN_CREDENTIAL``
 
-``AZURE_OVERWRITE_FILES``
+  A token credential used to authenticate HTTPS requests. The token value
+  should be updated before its expiration.
 
-    Overwrite an existing file when it has the same name as the file being uploaded.
-    Otherwise, rename it. Default is ``False``
 
-``AZURE_LOCATION``
+``cache_control`` or ``AZURE_CACHE_CONTROL``
 
-    Default location for the uploaded files. This is a path that gets prepended to every file name.
+  Default: ``None``
 
-``AZURE_ENDPOINT_SUFFIX``
+  A variable to set the Cache-Control HTTP response header. E.g.::
 
-    Defaults to ``core.windows.net``. Use ``core.chinacloudapi.cn`` for azure.cn accounts.
+    cache_control: "public,max-age=31536000,immutable"
 
-``AZURE_CUSTOM_DOMAIN``
+``object_parameters`` or ``AZURE_OBJECT_PARAMETERS``
 
-    The custom domain to use for generating URLs for files. For
-    example, ``www.mydomain.com`` or ``mycdn.azureedge.net``.
+  Default: ``{}``
 
-``AZURE_CONNECTION_STRING``
+  Use this to set content settings on all objects. To set these on a per-object
+  basis, subclass the backend and override ``AzureStorage.get_object_parameters``.
 
-    If specified, this will override all other parameters.
-    See http://azure.microsoft.com/en-us/documentation/articles/storage-configure-connection-string/
-    for the connection string format.
+  This is a Python ``dict`` and the possible parameters are: ``content_type``, ``content_encoding``, ``content_language``, ``content_disposition``, ``cache_control``, and ``content_md5``.
 
-``AZURE_TOKEN_CREDENTIAL``
+``api_version`` or ``AZURE_API_VERSION``
 
-    A token credential used to authenticate HTTPS requests. The token value
-    should be updated before its expiration.
+  Default: ``None``
 
+  The API version to use.
 
-``AZURE_CACHE_CONTROL``
 
-    A variable to set the Cache-Control HTTP response header. E.g.
-    ``AZURE_CACHE_CONTROL = "public,max-age=31536000,immutable"``
+Additional Notes
+----------------
 
-``AZURE_OBJECT_PARAMETERS``
+Filename Restrictions
+~~~~~~~~~~~~~~~~~~~~~
 
-    Use this to set content settings on all objects. To set these on a per-object
-    basis, subclass the backend and override ``AzureStorage.get_object_parameters``.
-
-    This is a Python ``dict`` and the possible parameters are: ``content_type``, ``content_encoding``, ``content_language``, ``content_disposition``, ``cache_control``, and ``content_md5``.
+Azure file names have some extra restrictions. They can't:
 
-``AZURE_API_VERSION``
+- end with a dot (``.``) or slash (``/``)
+- contain more than 256 slashes (``/``)
+- be longer than 1024 characters
 
-    The api version to use. The default value is ``None``.
+Private vs Public URLs
+~~~~~~~~~~~~~~~~~~~~~~
+
+The difference between public and private URLs is that private includes the SAS token.
+With private URLs you can override certain properties stored for the blob by specifying
+query parameters as part of the shared access signature. These properties include the
+cache-control, content-type, content-encoding, content-language, and content-disposition.
+See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-properties#remarks
+
+You can specify these parameters as follows::
+
+    az_storage = AzureStorage()
+    az_url = az_storage.url(blob_name, parameters={'content_type': 'text/html;'})
diff --git a/docs/backends/backblaze-B2.rst b/docs/backends/backblaze-B2.rst
index 4d30281c..e57749f5 100644
--- a/docs/backends/backblaze-B2.rst
+++ b/docs/backends/backblaze-B2.rst
@@ -8,6 +8,6 @@ Backblaze B2 implements an `S3 Compatible API `_.
 Best practice is to limit access to the bucket you just created.
 
 #. Follow the instructions in the :doc:`Amazon S3 docs <amazon-S3>` with the following exceptions:
 
-   * Set ``AWS_S3_REGION_NAME`` to your Backblaze B2 region, for example, ``us-west-004``
-   * Set ``AWS_S3_ENDPOINT_URL`` to ``https://s3.${AWS_S3_REGION_NAME}.backblazeb2.com``
-   * Set the values of ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY`` to the application key id and application key you created in step 2.
+   * Set ``region_name`` to your Backblaze B2 region, for example, ``us-west-004``
+   * Set ``endpoint_url`` to ``https://s3.${region_name}.backblazeb2.com``
+   * Set the values of ``access_key`` and ``secret_key`` to the application key id and application key you created in step 2.
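+
+Putting those overrides together, a Django >= 4.2 configuration for Backblaze B2 might look like the
+following sketch (the bucket, region, and key values are placeholders)::
+
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
+            "OPTIONS": {
+                "bucket_name": "my-b2-bucket",
+                "region_name": "us-west-004",
+                "endpoint_url": "https://s3.us-west-004.backblazeb2.com",
+                "access_key": "<application key id>",
+                "secret_key": "<application key>",
+            },
+        },
+    }
+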
diff --git a/docs/backends/digital-ocean-spaces.rst b/docs/backends/digital-ocean-spaces.rst
index c11af4bf..9f9a22df 100644
--- a/docs/backends/digital-ocean-spaces.rst
+++ b/docs/backends/digital-ocean-spaces.rst
@@ -3,6 +3,6 @@ Digital Ocean
 
 Digital Ocean Spaces implements the S3 protocol. To use it follow the instructions in the :doc:`Amazon S3 docs <amazon-S3>` with the important caveats that you must:
 
-- Set ``AWS_S3_REGION_NAME`` to your Digital Ocean region (such as ``nyc3`` or ``sfo2``)
-- Set ``AWS_S3_ENDPOINT_URL`` to the value of ``https://${AWS_S3_REGION_NAME}.digitaloceanspaces.com``
-- Set the values of ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY`` to the corresponding values from Digital Ocean
+- Set ``region_name`` to your Digital Ocean region (such as ``nyc3`` or ``sfo2``)
+- Set ``endpoint_url`` to the value of ``https://${region_name}.digitaloceanspaces.com``
+- Set the values of ``access_key`` and ``secret_key`` to the corresponding values from Digital Ocean
diff --git a/docs/backends/dropbox.rst b/docs/backends/dropbox.rst
index bbc1ffdb..fc944f7d 100644
--- a/docs/backends/dropbox.rst
+++ b/docs/backends/dropbox.rst
@@ -4,62 +4,67 @@ Dropbox
 
 A Django files storage using Dropbox as a backend via the official
 `Dropbox SDK for Python`_. Currently only v2 of the API is supported.
 
+Installation
+------------
+
 Before you start configuration, you will need to install the SDK
 which can be done for you automatically by doing::
 
    pip install django-storages[dropbox]
 
-Settings
---------
-
-To use DropboxStorage set::
+Configuration & Settings
+------------------------
 
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = "storages.backends.dropbox.DropboxStorage"
+Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently configure
+storage backends and add additional ones. Configuring multiple storage objects before Django 4.2 required subclassing the
+backend because the settings were global; now you pass them under the key ``OPTIONS``. For example, to save media files to
+Dropbox on Django >= 4.2 you'd define::
 
-    # django >= 4.2
-    STORAGES = {"default": {"BACKEND": "storages.backends.dropbox.DropboxStorage"}}
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.dropbox.DropboxStorage",
+            "OPTIONS": {
+                ...your_options_here
+            },
+        },
+    }
 
-Two methods of authenticating are supported:
+On Django < 4.2 you'd instead define::
 
-1. using an access token
-2. using a refresh token with an app key and secret
+    DEFAULT_FILE_STORAGE = "storages.backends.dropbox.DropboxStorage"
 
-Dropbox has recently introduced short-lived access tokens only, and does not seem to allow new apps to generate access tokens that do not expire. Short-lived access tokens can be indentified by their prefix (short-lived access tokens start with ``'sl.'``).
+To put static files on Dropbox via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the same level as
+``default`` above) inside of the ``STORAGES`` dictionary, while on Django < 4.2 you'd instead define::
 
-Please set the following variables accordingly:
+    STATICFILES_STORAGE = "storages.backends.dropbox.DropboxStorage"
 
-``DROPBOX_OAUTH2_TOKEN``
-    Your Dropbox token. You can obtain one by following the instructions in the `tutorial`_.
 
-``DROPBOX_APP_KEY``
-    Your Dropbox appkey. You can obtain one by following the instructions in the `tutorial`_.
+The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
+well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.
 
-``DROPBOX_APP_SECRET``
-    Your Dropbox secret. You can obtain one by following the instructions in the `tutorial`_.
+Authentication
+--------------
 
-``DROPBOX_OAUTH2_REFRESH_TOKEN``
-    Your Dropbox refresh token. You can obtain one by following the instructions in the `tutorial`_.
+Two methods of authentication are supported:
 
-The refresh token can be obtained using the `commandline-oauth.py`_ example from the `Dropbox SDK for Python`_.
+#. Using an access token
+#. Using a refresh token with an app key and secret
 
-``DROPBOX_ROOT_PATH`` (optional, default ``"/"``)
-    Path which will prefix all uploaded files. Must begin with a ``/``.
+Dropbox has recently introduced short-lived access tokens only, and does not seem to allow new apps to generate access tokens that do not expire. Short-lived access tokens can be identified by their prefix (short-lived access tokens start with ``'sl.'``).
 
-``DROPBOX_TIMEOUT`` (optional, default ``100``)
-    Timeout in seconds for requests to the API. If ``None``, the client will wait forever.
-    The default value matches the SDK at the time of this writing.
+You can manually obtain the refresh token by following the instructions below using ``APP_KEY`` and ``APP_SECRET``.
 
-``DROPBOX_WRITE_MODE`` (optional, default ``"add"``)
-    Sets the Dropbox WriteMode strategy. Read more in the `official docs`_.
+The relevant settings, which can all be obtained by following the instructions in the `tutorial`_:
 
-Obtain the refresh token manually
-#################################
+#. ``oauth2_access_token`` or ``DROPBOX_OAUTH2_TOKEN``
+#. ``oauth2_refresh_token`` or ``DROPBOX_OAUTH2_REFRESH_TOKEN``
+#. ``app_secret`` or ``DROPBOX_APP_SECRET``
+#. ``app_key`` or ``DROPBOX_APP_KEY``
 
-You can obtain the refresh token manually via ``APP_KEY`` and ``APP_SECRET``.
+The refresh token can be obtained using the `commandline-oauth.py`_ example from the `Dropbox SDK for Python`_.
 
 Get AUTHORIZATION_CODE
-**********************
+~~~~~~~~~~~~~~~~~~~~~~
 
 Using your ``APP_KEY`` follow the link:
 
@@ -68,7 +73,7 @@ Using your ``APP_KEY`` follow the link:
 It will give you ``AUTHORIZATION_CODE``.
 
 Obtain the refresh token
-*************************
+~~~~~~~~~~~~~~~~~~~~~~~~
 
 Using your ``APP_KEY``, ``APP_SECRET`` and ``AUTHORIZATION_CODE`` obtain the refresh token.
 
@@ -93,6 +98,29 @@ The response would be:
         "account_id": "dbid:************************"
     }
 
+Settings
+--------
+
+``root_path`` or ``DROPBOX_ROOT_PATH``
+
+  Default: ``'/'``
+
+  Path which will prefix all uploaded files. Must begin with a ``/``.
+
+``timeout`` or ``DROPBOX_TIMEOUT``
+
+  Default: ``100``
+
+  Timeout in seconds for requests to the API. If ``None``, the client will wait forever.
+  The default value matches the SDK at the time of this writing.
+
+``write_mode`` or ``DROPBOX_WRITE_MODE``
+
+  Default: ``'add'``
+
+  Sets the Dropbox WriteMode strategy. Read more in the `official docs`_.
+
 
 .. _`tutorial`: https://www.dropbox.com/developers/documentation/python#tutorial
 .. _`Dropbox SDK for Python`: https://www.dropbox.com/developers/documentation/python#tutorial
 .. _`official docs`: https://dropbox-sdk-python.readthedocs.io/en/latest/api/files.html#dropbox.files.WriteMode
diff --git a/docs/backends/gcloud.rst b/docs/backends/gcloud.rst
index e62b3184..57267154 100644
--- a/docs/backends/gcloud.rst
+++ b/docs/backends/gcloud.rst
@@ -12,103 +12,91 @@ Use pip to install from PyPI::
 
     pip install django-storages[google]
 
-Authentication
---------------
-By default, this library will try to use the credentials associated with the
-current Google Cloud infrastructure/environment for authentication.
-
-In most cases, the default service accounts are not sufficient to read/write and sign files in GCS, so you will need to create a dedicated service account:
-
-1. Create a service account. (`Google Getting Started Guide `__)
-
-2. Make sure your service account has access to the bucket and appropriate permissions. (`Using IAM Permissions `__)
+Configuration & Settings
+------------------------
 
-3. Ensure this service account is associated to the type of compute being used (Google Compute Engine (GCE), Google Kubernetes Engine (GKE), Google Cloud Run (GCR), etc)
+Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently configure
+storage backends and add additional ones. Configuring multiple storage objects before Django 4.2 required subclassing the
+backend because the settings were global; now you pass them under the key ``OPTIONS``. For example, to save media files to
+GCS on Django >= 4.2 you'd define::
 
-For development use cases, or other instances outside Google infrastructure:
-
-4. Create the key and download `your-project-XXXXX.json` file.
-
-5. Ensure the key is mounted/available to your running Django app.
-6. Set an environment variable of GOOGLE_APPLICATION_CREDENTIALS to the path of the json file.
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
+            "OPTIONS": {
+                ...your_options_here
+            },
+        },
+    }
 
-Alternatively, you can use the setting `GS_CREDENTIALS` as described below.
+On Django < 4.2 you'd instead define::
 
+    DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
 
-Getting Started
----------------
-Set the default storage and bucket name in your settings.py file
+To put static files on GCS via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the same level as
+``default`` above) inside of the ``STORAGES`` dictionary, while on Django < 4.2 you'd instead define::
 
-.. code-block:: python
+    STATICFILES_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
 
-    # django < 4.2
-    DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
+The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
+well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.
 
-    # django >= 4.2
-    STORAGES = {"default": {"BACKEND": "storages.backends.gcloud.GoogleCloudStorage"}}
-
-    GS_BUCKET_NAME = 'YOUR_BUCKET_NAME_GOES_HERE'
+Authentication Settings
+~~~~~~~~~~~~~~~~~~~~~~~
+
+By default, this library will try to use the credentials associated with the
+current Google Cloud infrastructure/environment for authentication.
+
+In most cases, the default service accounts are not sufficient to read/write and sign files in GCS, so you will need to create a dedicated service account:
 
-To allow ``django-admin`` collectstatic to automatically put your static files in your bucket set the following in your settings.py::
+#. Create a service account. (`Google Getting Started Guide <https://cloud.google.com/docs/authentication/getting-started>`__)
+#. Make sure your service account has access to the bucket and appropriate permissions. (`Using IAM Permissions <https://cloud.google.com/storage/docs/access-control/using-iam-permissions>`__)
+#. Ensure this service account is associated to the type of compute being used (Google Compute Engine (GCE), Google Kubernetes Engine (GKE), Google Cloud Run (GCR), etc)
 
-    # django < 4.2
-    STATICFILES_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
+For development use cases, or other instances outside Google infrastructure:
 
-    # django >= 4.2
-    STORAGES = {"staticfiles": {"BACKEND": "storages.backends.gcloud.GoogleCloudStorage"}}
+#. Create the key and download ``your-project-XXXXX.json`` file.
+#. Ensure the key is mounted/available to your running Django app.
+#. Set an environment variable of GOOGLE_APPLICATION_CREDENTIALS to the path of the json file.
 
-Once you're done, default_storage will be Google Cloud Storage
+Alternatively, you can use the setting ``credentials`` or ``GS_CREDENTIALS`` as described below.
 
-.. code-block:: python
-
-    >>> from django.core.files.storage import default_storage
-    >>> print(default_storage.__class__)
-
+Settings
+~~~~~~~~
 
-This way, if you define a new FileField, it will use the Google Cloud Storage
+``bucket_name`` or ``GS_BUCKET_NAME``
 
-.. code-block:: python
+  **Required**
 
-    >>> from django.db import models
-    >>> class Resume(models.Model):
-    ...     pdf = models.FileField(upload_to='pdfs')
-    ...     photos = models.ImageField(upload_to='photos')
-    ...
-    >>> resume = Resume()
-    >>> print(resume.pdf.storage)
-
+  The name of the GCS bucket that will host the files.
 
-Settings
---------
+``project_id`` or ``GS_PROJECT_ID``
 
-``GS_BUCKET_NAME``
+  default: ``None``
 
-Your Google Storage bucket name, as a string. Required.
+  Your Google Cloud project ID. If unset, falls back to the default inferred from the environment.
 
-``GS_PROJECT_ID`` (optional)
+``gzip`` or ``GS_IS_GZIPPED``
 
-Your Google Cloud project ID. If unset, falls back to the default
-inferred from the environment.
+  default: ``False``
 
-``GS_IS_GZIPPED`` (optional: default is ``False``)
+  Whether or not to enable gzipping of content types specified by ``gzip_content_types``.
 
-Whether or not to enable gzipping of content types specified by ``GZIP_CONTENT_TYPES``
+``gzip_content_types`` or ``GZIP_CONTENT_TYPES``
 
-``GZIP_CONTENT_TYPES`` (optional: default is ``text/css``, ``text/javascript``, ``application/javascript``, ``application/x-javascript``, ``image/svg+xml``)
+  default: ``(text/css,text/javascript,application/javascript,application/x-javascript,image/svg+xml)``
 
-When ``GS_IS_GZIPPED`` is set to ``True`` the content types which will be gzipped
+  The list of content types to be gzipped when ``gzip`` is ``True``.
 
 .. _gs-creds:
 
-``GS_CREDENTIALS`` (optional)
+``credentials`` or ``GS_CREDENTIALS``
 
-The OAuth 2 credentials to use for the connection. If unset, falls
-back to the default inferred from the environment
-(i.e. GOOGLE_APPLICATION_CREDENTIALS)
+  default: ``None``
 
-::
+  The OAuth 2 credentials to use for the connection. If unset, falls back to the default inferred from the environment
+  (i.e. ``GOOGLE_APPLICATION_CREDENTIALS``)::
 
     from google.oauth2 import service_account
 
@@ -118,18 +106,20 @@ back to the default inferred from the environment
 
 .. _gs-default-acl:
 
-``GS_DEFAULT_ACL`` (optional, default is None)
+``default_acl`` or ``GS_DEFAULT_ACL``
+
+  default: ``None``
 
-ACL used when creating a new blob, from the
-`list of predefined ACLs `_.
-(A "JSON API" ACL is preferred but an "XML API/gsutil" ACL will be
-translated.)
+  ACL used when creating a new blob, from the
+  `list of predefined ACLs <https://cloud.google.com/storage/docs/access-control/lists#predefined-acl>`_.
+  (A "JSON API" ACL is preferred but an "XML API/gsutil" ACL will be
+  translated.)
 
-For most cases, the blob will need to be set to the ``publicRead`` ACL in order for the file to be viewed.
-If ``GS_DEFAULT_ACL`` is not set, the blob will have the default permissions set by the bucket.
+  For most cases, the blob will need to be set to the ``publicRead`` ACL in order for the file to be viewed.
+  If ``default_acl`` is not set, the blob will have the default permissions set by the bucket.
 
-``publicRead`` files will return a public, non-expiring url. All other files return
-a signed (expiring) url.
+  ``publicRead`` files will return a public, non-expiring url. All other files return
+  a signed (expiring) url.
 
 .. note::
    GS_DEFAULT_ACL must be set to 'publicRead' to return a public url. Even if you set
@@ -141,186 +131,91 @@ a signed (expiring) url.
   already have a bucket with ``Uniform`` access control set to public read, please keep
   ``GS_DEFAULT_ACL`` to ``None`` and set ``GS_QUERYSTRING_AUTH`` to ``False``.
 
-``GS_QUERYSTRING_AUTH`` (optional, default is True)
-
-If set to ``False`` it forces the url not to be signed. This setting is useful if you need to have a
-bucket configured with ``Uniform`` access control configured with public read. In that case you should
-force the flag ``GS_QUERYSTRING_AUTH = False`` and ``GS_DEFAULT_ACL = None``
+``querystring_auth`` or ``GS_QUERYSTRING_AUTH``
 
-``GS_FILE_OVERWRITE`` (optional: default is ``True``)
+  default: ``True``
 
-By default files with the same name will overwrite each other. Set this to ``False`` to have extra characters appended.
+  If set to ``False`` it forces the url not to be signed. This setting is useful if you need to have a
+  bucket configured with ``Uniform`` access control configured with public read. In that case you should
+  force the flag ``GS_QUERYSTRING_AUTH = False`` and ``GS_DEFAULT_ACL = None``
 
-``GS_MAX_MEMORY_SIZE`` (optional)
+``file_overwrite`` or ``GS_FILE_OVERWRITE``
 
-The maximum amount of memory a returned file can take up (in bytes) before being
-rolled over into a temporary file on disk. Default is 0: Do not roll over.
-
-``GS_BLOB_CHUNK_SIZE`` (optional: default is ``None``)
-
-The size of blob chunks that are sent via resumable upload. If this is not set then the generated request
-must fit in memory. Recommended if you are going to be uploading large files.
-
-.. note::
-
-    This must be a multiple of 256K (1024 * 256)
+  default: ``True``
 
-``GS_OBJECT_PARAMETERS`` (optional: default is ``{}``)
+  By default files with the same name will overwrite each other. Set this to ``False`` to have extra characters appended.
 
-Dictionary of key-value pairs mapping from blob property name to value.
+``max_memory_size`` or ``GS_MAX_MEMORY_SIZE``
 
-Use this to set parameters on all objects. To set these on a per-object
-basis, subclass the backend and override ``GoogleCloudStorage.get_object_parameters``.
+  default: ``0`` (i.e. do not roll over)
 
-The valid property names are ::
+  The maximum amount of memory a returned file can take up (in bytes) before being
+  rolled over into a temporary file on disk.
 
-    acl
-    cache_control
-    content_disposition
-    content_encoding
-    content_language
-    content_type
-    metadata
-    storage_class
+``blob_chunk_size`` or ``GS_BLOB_CHUNK_SIZE``
 
-If not set, the ``content_type`` property will be guessed.
+  default: ``None``
 
-If set, ``acl`` overrides :ref:`GS_DEFAULT_ACL `.
+  The size of blob chunks that are sent via resumable upload. If this is not set then the generated request
+  must fit in memory. Recommended if you are going to be uploading large files.
 
-.. warning::
-
-    Do not set ``name``. This is set automatically based on the filename.
+.. note::
 
-``GS_CUSTOM_ENDPOINT`` (optional: default is ``None``)
+    This must be a multiple of 256K (1024 * 256)
 
-Sets a `custom endpoint `_,
-that will be used instead of ``https://storage.googleapis.com`` when generating URLs for files.
+``object_parameters`` or ``GS_OBJECT_PARAMETERS``
 
-``GS_LOCATION`` (optional: default is ``''``)
+  default: ``{}``
 
-Subdirectory in which the files will be stored.
-Defaults to the root of the bucket.
+  Dictionary of key-value pairs mapping from blob property name to value.
 
-``GS_EXPIRATION`` (optional: default is ``timedelta(seconds=86400)``)
+  Use this to set parameters on all objects. To set these on a per-object
+  basis, subclass the backend and override ``GoogleCloudStorage.get_object_parameters``.
 
-The time that a generated URL is valid before expiration. The default is 1 day.
-Public files will return a url that does not expire. Files will be signed by
-the credentials provided to django-storages (See :ref:`GS Credentials `).
+  The valid property names are::
 
-Note: Default Google Compute Engine (GCE) Service accounts are
-`unable to sign urls `_.
+    acl
+    cache_control
+    content_disposition
+    content_encoding
+    content_language
+    content_type
+    metadata
+    storage_class
 
-The ``GS_EXPIRATION`` value is handled by the underlying `Google library `_.
-It supports `timedelta`, `datetime`, or `integer` seconds since epoch time.
+  If not set, the ``content_type`` property will be guessed.
 
-Note: The maximum value for this option is 7 days (604800 seconds) in version `v4` (See this `Github issue `_)
+  If set, ``acl`` overrides :ref:`GS_DEFAULT_ACL <gs-default-acl>`.
 
-Usage
------
 .. warning::
 
-Fields
-^^^^^^
     Do not set ``name``. This is set automatically based on the filename.
 
-Once you're done, default_storage will be Google Cloud Storage
+``custom_endpoint`` or ``GS_CUSTOM_ENDPOINT``
 
-.. code-block:: python
+  default: ``None``
 
-    >>> from django.core.files.storage import default_storage
-    >>> print(default_storage.__class__)
-
+  Sets a `custom endpoint <https://cloud.google.com/storage/docs/request-endpoints>`_,
+  that will be used instead of ``https://storage.googleapis.com`` when generating URLs for files.
 
-This way, if you define a new FileField, it will use the Google Cloud Storage
+``location`` or ``GS_LOCATION``
 
-.. code-block:: python
+  default: ``''``
 
-    >>> from django.db import models
-    >>> class Resume(models.Model):
-    ...     pdf = models.FileField(upload_to='pdfs')
-    ...     photos = models.ImageField(upload_to='photos')
-    ...
-    >>> resume = Resume()
-    >>> print(resume.pdf.storage)
-
+  Subdirectory in which files will be stored.
 
-Storage
-^^^^^^^
+``expiration`` or ``GS_EXPIRATION``
 
-Standard file access options are available, and work as expected
+  default: ``timedelta(seconds=86400)``
 
-.. code-block:: python
-
-    >>> default_storage.exists('storage_test')
-    False
-    >>> file = default_storage.open('storage_test', 'w')
-    >>> file.write('storage contents')
-    >>> file.close()
-
-    >>> default_storage.exists('storage_test')
-    True
-    >>> file = default_storage.open('storage_test', 'r')
-    >>> file.read()
-    'storage contents'
-    >>> file.close()
-
-    >>> default_storage.delete('storage_test')
-    >>> default_storage.exists('storage_test')
-    False
-
-Model
-^^^^^
-
-An object without a file has limited functionality
-
-.. code-block:: python
-
-    >>> obj1 = Resume()
-    >>> obj1.pdf
-
-    >>> obj1.pdf.size
-    Traceback (most recent call last):
-    ...
-    ValueError: The 'pdf' attribute has no file associated with it.
-
-Saving a file enables full functionality
-
-.. code-block:: python
-
-    >>> obj1.pdf.save('django_test.txt', ContentFile('content'))
-    >>> obj1.pdf
-
-    >>> obj1.pdf.size
-    7
-    >>> obj1.pdf.read()
-    'content'
+  The time that a generated URL is valid before expiration. The default is 1 day.
+  Public files will return a url that does not expire. Files will be signed by
+  the credentials provided to django-storages (See :ref:`GS Credentials <gs-creds>`).
 
-Files can be read in a little at a time, if necessary
+  Note: Default Google Compute Engine (GCE) Service accounts are
+  `unable to sign urls <https://github.com/googleapis/google-auth-library-python/issues/50>`_.
 
-.. code-block:: python
-
-    >>> obj1.pdf.open()
-    >>> obj1.pdf.read(3)
-    'con'
-    >>> obj1.pdf.read()
-    'tent'
-    >>> '-'.join(obj1.pdf.chunks(chunk_size=2))
-    'co-nt-en-t'
+  The ``expiration`` value is handled by the underlying `Google library `_.
+  It supports `timedelta`, `datetime`, or `integer` seconds since epoch time.
 
-Save another file with the same name
-
-.. code-block:: python
-
-    >>> obj2 = Resume()
-    >>> obj2.pdf.save('django_test.txt', ContentFile('more content'))
-    >>> obj2.pdf
-
-    >>> obj2.pdf.size
-    12
-
-Push the objects into the cache to make sure they pickle properly
-
-.. code-block:: python
-
-    >>> cache.set('obj1', obj1)
-    >>> cache.set('obj2', obj2)
-    >>> cache.get('obj2').pdf
-
+  Note: The maximum value for this option is 7 days (604800 seconds) in version `v4` (See this `Github issue `_)
diff --git a/docs/backends/oracle-cloud.rst b/docs/backends/oracle-cloud.rst
index ee66334d..d2177cb7 100644
--- a/docs/backends/oracle-cloud.rst
+++ b/docs/backends/oracle-cloud.rst
@@ -5,12 +5,12 @@ Oracle cloud provides S3 compatible object storage. To use it follow the instruc
 configurations on settings.py
 
 - Create a `Customer Secret Key`_
-- Use generated key as ``AWS_SECRET_ACCESS_KEY``
-- And the value in the *Access Key* column as ``AWS_ACCESS_KEY_ID``
-- Set ``AWS_STORAGE_BUCKET_NAME`` with your bucket name
-- Set ``AWS_S3_REGION_NAME`` with the current region
+- Use the generated key as ``secret_key``
+- And the value in the *Access Key* column as ``access_key``
+- Set ``bucket_name`` with your bucket name
+- Set ``region_name`` with the current region
 
-And last but most importantly set the ``AWS_S3_ENDPOINT_URL`` with:
+And last but most importantly set the ``endpoint_url`` with:
 
 ``https://{ORACLE_NAMESPACE}.compat.objectstorage.{ORACLE_REGION}.oraclecloud.com``
 
@@ -29,4 +29,4 @@ References
 .. _Oracle object storage namespaces docs: https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/understandingnamespaces.htm#Understanding_Object_Storage_Namespaces
 .. _Amazon S3 Compatibility API docs: https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm#
 .. _Amazon S3 Compatibility API endpoints: https://docs.oracle.com/en-us/iaas/api/#/en/s3objectstorage/20160918/
-.. _Customer Secret Key: https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingcredentials.htm#To4
\ No newline at end of file
+.. _Customer Secret Key: https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingcredentials.htm#To4
@@ -29,4 +29,4 @@ References
 .. _Oracle object storage namespaces docs: https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/understandingnamespaces.htm#Understanding_Object_Storage_Namespaces
 .. _Amazon S3 Compatibility API docs: https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm#
 .. _Amazon S3 Compatibility API endpoints: https://docs.oracle.com/en-us/iaas/api/#/en/s3objectstorage/20160918/
-.. _Customer Secret Key: https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingcredentials.htm#To4
\ No newline at end of file
+.. _Customer Secret Key: https://docs.oracle.com/en-us/iaas/Content/Identity/Tasks/managingcredentials.htm#To4
diff --git a/docs/backends/sftp.rst b/docs/backends/sftp.rst
index d0930356..5141c273 100644
--- a/docs/backends/sftp.rst
+++ b/docs/backends/sftp.rst
@@ -1,72 +1,127 @@
 SFTP
 ====

-Settings
---------
+Installation
+------------
+
+Install via::
+
+    pip install django-storages[sftp]
+
+Configuration & Settings
+------------------------

-Set the default storage (i.e: for media files) and the static storage (i.e: for
-static files) to use the sftp backend::
+Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently configure
+storage backends and add additional ones. Configuring multiple storage objects pre Django 4.2 required subclassing the backend
+because the settings were global; now you pass them under the key ``OPTIONS``. For example, to save media files to SFTP on Django
+>= 4.2 you'd define::
+
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.sftpstorage.SFTPStorage",
+            "OPTIONS": {
+                ...your_options_here
+            },
+        },
+    }
+
+On Django < 4.2 you'd instead define::

-    # django < 4.2
     DEFAULT_FILE_STORAGE = "storages.backends.sftpstorage.SFTPStorage"
+
+To put static files on SFTP via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the same level as
+``default`` above) inside of the ``STORAGES`` dictionary, while on Django < 4.2 you'd instead define::
+
     STATICFILES_STORAGE = "storages.backends.sftpstorage.SFTPStorage"

-    # django >= 4.2
-    STORAGES = {
-        "default": {"BACKEND": "storages.backends.sftpstorage.SFTPStorage"},
-        "staticfiles": {"BACKEND": "storages.backends.sftpstorage.SFTPStorage"},
-    }
+The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
+well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.
+
+Settings
+~~~~~~~~
+
+``host`` or ``SFTP_STORAGE_HOST``
+
+  **Required**
+
+  The hostname where you want the files to be saved.
+
+``root_path`` or ``SFTP_STORAGE_ROOT``
+
+  Default: ``''``
+
+  The root directory on the remote host into which files should be placed.
+  Should work the same way that ``STATIC_ROOT`` works for local files. Must
+  include a trailing slash.
+
+``params`` or ``SFTP_STORAGE_PARAMS``
+
+  Default: ``{}``
+
+  A dictionary containing connection parameters to be passed as keyword
+  arguments to ``paramiko.SSHClient().connect()`` (do not include hostname here).
+  See `paramiko SSHClient.connect() documentation`_ for details.
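+
+  For example, a key-based login sketch (the username, key path, and port are
+  placeholders; the keys are ``paramiko.SSHClient().connect()`` keyword arguments)::
+
+    SFTP_STORAGE_PARAMS = {
+        "username": "deploy",  # placeholder remote account
+        "key_filename": "/home/deploy/.ssh/id_ed25519",  # placeholder key path
+        "port": 22,
+    }
+
+  The same dictionary can be passed as ``"params"`` under ``OPTIONS`` on Django >= 4.2.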
+
+``interactive`` or ``SFTP_STORAGE_INTERACTIVE``
+
+  Default: ``False``
+
+  A boolean indicating whether to prompt for a password if the connection cannot
+  be made using keys, and there is not already a password in
+  ``params``. You can set this to ``True`` to enable interactive
+  login when running ``manage.py collectstatic``, for example.
+
+  .. warning::
+
+    DO NOT set ``interactive`` to ``True`` if you are using this storage
+    for files being uploaded to your site by users, because you'll have no way
+    to enter the password when they submit the form.
+
+``file_mode`` or ``SFTP_STORAGE_FILE_MODE``
+
+  Default: ``None``
+
+  A bitmask for setting permissions on newly-created files. See
+  `Python os.chmod documentation`_ for acceptable values.
+
+``dir_mode`` or ``SFTP_STORAGE_DIR_MODE``
+
+  Default: ``None``
+
+  A bitmask for setting permissions on newly-created directories. See
+  `Python os.chmod documentation`_ for acceptable values.
+
+  .. note::
+
+    Hint: if you start the mode number with a 0 you can express it in octal
+    just like you would when doing "chmod 775 myfile" from bash.
+
-``SFTP_STORAGE_HOST``
-    The hostname where you want the files to be saved.
-
-``SFTP_STORAGE_ROOT``
-    The root directory on the remote host into which files should be placed.
-    Should work the same way that ``STATIC_ROOT`` works for local files. Must
-    include a trailing slash.
-
-``SFTP_STORAGE_PARAMS`` (optional)
-    A dictionary containing connection parameters to be passed as keyword
-    arguments to ``paramiko.SSHClient().connect()`` (do not include hostname here).
-    See `paramiko SSHClient.connect() documentation`_ for details
-
-``SFTP_STORAGE_INTERACTIVE`` (optional)
-    A boolean indicating whether to prompt for a password if the connection cannot
-    be made using keys, and there is not already a password in
-    ``SFTP_STORAGE_PARAMS``. You can set this to ``True`` to enable interactive
-    login when running ``manage.py collectstatic``, for example.
-
-    .. warning::
-
-        DO NOT set SFTP_STORAGE_INTERACTIVE to True if you are using this storage
-        for files being uploaded to your site by users, because you'll have no way
-        to enter the password when they submit the form..
-
-``SFTP_STORAGE_FILE_MODE`` (optional)
-    A bitmask for setting permissions on newly-created files. See
-    `Python os.chmod documentation`_ for acceptable values.
-
-``SFTP_STORAGE_DIR_MODE`` (optional)
-    A bitmask for setting permissions on newly-created directories. See
-    `Python os.chmod documentation`_ for acceptable values.
-
-    .. note::
-
-        Hint: if you start the mode number with a 0 you can express it in octal
-        just like you would when doing "chmod 775 myfile" from bash.
-
-``SFTP_STORAGE_UID`` (optional)
-    UID of the account that should be set as the owner of the files on the remote
-    host. You may have to be root to set this.
-
-``SFTP_STORAGE_GID`` (optional)
-    GID of the group that should be set on the files on the remote host. You have
-    to be a member of the group to set this.
-
-``SFTP_KNOWN_HOST_FILE`` (optional)
-    Absolute path of know host file, if it isn't set ``"~/.ssh/known_hosts"`` will be used.
+``uid`` or ``SFTP_STORAGE_UID``
+
+  Default: ``None``
+
+  UID of the account that should be set as the owner of the files on the remote
+  host. You may have to be root to set this.
+
+``gid`` or ``SFTP_STORAGE_GID``
+
+  Default: ``None``
+
+  GID of the group that should be set on the files on the remote host. You have
+  to be a member of the group to set this.
+
+``known_host_file`` or ``SFTP_KNOWN_HOST_FILE``
+
+  Default: ``None``
+
+  Absolute path of the known hosts file; if it isn't set, ``"~/.ssh/known_hosts"`` will be used.
+
+``base_url``
+
+  Default: Django ``MEDIA_URL`` setting
+
+  The URL to serve files from.

 .. _`paramiko SSHClient.connect() documentation`: http://docs.paramiko.org/en/latest/api/client.html#paramiko.client.SSHClient.connect
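+
+Putting the settings together, a Django >= 4.2 configuration might look like the
+following sketch (the host, root path, and permission bits are placeholders)::
+
+    STORAGES = {
+        "default": {
+            "BACKEND": "storages.backends.sftpstorage.SFTPStorage",
+            "OPTIONS": {
+                "host": "media.example.com",  # placeholder host
+                "root_path": "/var/www/media/",  # note the trailing slash
+                "file_mode": 0o644,  # rw-r--r-- on uploaded files
+                "dir_mode": 0o755,  # rwxr-xr-x on created directories
+            },
+        },
+    }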