
Commit

Merge branch 'release/2.3.0'
socketwench committed Apr 6, 2020
2 parents c4cd861 + e5f4968 commit 3c64826
Showing 9 changed files with 347 additions and 17 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
.idea
test/*
54 changes: 43 additions & 11 deletions README.md
@@ -188,10 +188,7 @@ Where:

## Backing up databases

Tractorbeam supports backing the following databases:

* MySQL/MariaDB databases which are network accessible
* Database relationships on Platform.sh
Tractorbeam supports backing up databases both from network-accessible database servers and from popular hosting platforms.

### Backing up MySQL/MariaDB

@@ -236,6 +233,35 @@ Where:

* **passwordFile** is the full path inside the container to a file containing the database password.
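
This release also passes optional TLS client certificate settings through to the MySQL backup task (see the `databases.yml` change further down in this commit). The README hunk above does not document them, so the following is only a hedged sketch: the `tlscert` and `tlskey` key names come from that task change, while the remaining keys (including the assumed `name` key) follow the existing `databases` documentation.

```yaml
tractorbeam:
  databases:
    - name: "mydatabase"                            # assumed key; not shown in this diff
      host: "db.example.com"
      user: "backup_user"
      passwordFile: "/config/db-backup/password.txt"
      tlscert: "/config/db-backup/client-cert.pem"  # new in 2.3.0, passed as client_cert
      tlskey: "/config/db-backup/client-key.pem"    # new in 2.3.0, passed as client_key
      bucket: "my_bucket_name"
      prefix: "my/custom/prefix"
      accessKey: "abcef123456"
      secretKey: "abcef123456"
      endpoint: "https://sfo2.digitaloceanspaces.com"
```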

### Backing up from Pantheon.io

Tractorbeam can create backups of your Pantheon.io sites and upload them off-site to an S3 bucket:

```yaml
tractorbeam:
  pantheon:
    - site: "wxyz0987"
      environment: "live"
      element: "database"
      machineToken: "abcdefghijklmnop1234567890"
      cacheDir: "/config/panth-backup"
      bucket: "my_bucket_name"
      prefix: "my/custom/prefix"
      accessKey: "abcef123456"
      secretKey: "abcef123456"
      endpoint: "https://sfo2.digitaloceanspaces.com"
```

* **site** is the site ID on Pantheon.io. Required.
* **environment** is the environment to back up. Optional, defaults to `master`.
* **element** is the type of backup to perform. This can be `database` or `files`. Optional, defaults to `database`.
* **machineTokenFile** is the full path inside the container to a file containing the [Pantheon.io machine token](https://pantheon.io/docs/machine-tokens). Required unless `machineToken` is defined.
* **machineToken** is your Pantheon.io machine token. Optional if `machineTokenFile` is defined; see the variant sketched below.
* **cacheDir** is the full path inside the container to a directory used to cache the downloaded backup between runs. Optional; a temporary directory is used when not set.
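
Because the machine token is a credential, you may prefer to keep it out of the main config file. A minimal variant of the item above using `machineTokenFile`, backing up files rather than the database (the file path is illustrative):

```yaml
tractorbeam:
  pantheon:
    - site: "wxyz0987"
      environment: "live"
      element: "files"
      machineTokenFile: "/config/pantheon-backup/machine-token.txt"
      bucket: "my_bucket_name"
      prefix: "my/custom/prefix"
      accessKey: "abcef123456"
      secretKey: "abcef123456"
      endpoint: "https://sfo2.digitaloceanspaces.com"
```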

Note that unlike many other backup types, a Pantheon backup can be either the database or the files. Files are backed up as a dated archive, not as a rolling directory of files.

Note that you can associate multiple SSH keys with your Pantheon.io account. It is highly recommended to create a dedicated key for Tractorbeam, rather than share your existing key.

### Backing up Platform.sh DB relationships

This container can also back up a database relationship on your Platform.sh project to S3:
@@ -270,9 +296,6 @@ Tractorbeam can download and backup a snapshot of a remote directory. Once downl

Downloading and creating snapshot files is a space and bandwidth intensive process. Avoid using this method for directories where the contents are larger than 300MB. Instead, see "Rolling Directory Backups" below.

Tractorbeam supports the following file archive backups:
* SSH-accessible (SFTP/rsync-over-ssh) sources

### Backing up files over SSH

To back up snapshots of a directory over SSH, create the `tractorbeam.archives` item:
@@ -316,10 +339,6 @@ Note, it is best to always create a dedicated SSH key for Tractorbeam, rather th

For large directories (>300MB), archiving a new snapshot each time is space and bandwidth intensive. In that case, you may wish to do a "rolling" backup. That is, only the most recent contents of the source directory are preserved, with no archiving or timestamping performed. This is useful for website managed file directories where most changes are new files being added, rather than existing files being modified.

Tractorbeam supports the following rolling directory backups:
* SSH-accessible (SFTP/rsync-over-ssh) sources
* S3 to S3

### Caching files for rolling backups

Synchronizing files for backup can consume a considerable amount of bandwidth, especially when the files you're backing up only gain additions or change rarely. In those cases, you may use the `cacheDir` key to keep downloaded files within the container between runs:
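
The full caching example is collapsed in this diff view. As a hedged sketch, assuming rolling file items live under `tractorbeam.files` and that `host` and `user` keys name the source server (both assumptions; paths and credentials are illustrative), an item with a cache directory might look like:

```yaml
tractorbeam:
  files:
    - user: "backupuser"                  # assumed key name
      host: "files.example.com"           # assumed key name
      path: "/var/www/html/sites/default/files"
      identityFile: "/config/my-backup/id_rsa"
      cacheDir: "/backups/files-cache"    # reused between runs instead of a temp dir
      bucket: "my_bucket_name"
      prefix: "my/custom/prefix"
      accessKey: "abcef123456"
      secretKey: "abcef123456"
      endpoint: "https://sfo2.digitaloceanspaces.com"
```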
@@ -351,6 +370,9 @@ tractorbeam:
path: "/path/to/my/files"
delete: true
identityFile: "/config/my-backup/id_rsa"
port: "22"
options:
- "-az"
bucket: "my_bucket_name"
prefix: "my/custom/prefix"
accessKey: "abcef123456"
@@ -365,9 +387,15 @@ Each item in the list is a SSH/SFTP/rsync-to-S3 rolling backup to perform, where
* **path** is the path on the remote server from which to backup files. Required.
* **delete** specifies if files not present in the source directory should be deleted in S3. Optional, defaults to true.
* **identityFile** is the full path inside the container to the SSH private key with which to connect to the source server. The public key must be in the same directory. Required.
* **port** is the SSH port number if an alternative port is used. Optional.
* **options** is a list of options to pass to the `rsync` command. Optional, defaults to a single option, `-az` (see the example below).
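
For example, to keep the default archive-and-compress behavior while adding progress output, the `options` key of an item shown above could be set as follows; the extra flag is a standard `rsync` option and is shown purely as an illustration:

```yaml
      options:
        - "-az"
        - "--info=progress2"
```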

Note, it is best to always create a dedicated SSH key for Tractorbeam, rather than share your existing SSH keys.

### Backing up Pantheon files

See the "Backing up from Pantheon.io" section above; use `element: "files"` on the item.

### Backing up Platform.sh file mounts

This container can also back up a file mount on your Platform.sh project to S3:
@@ -409,6 +437,7 @@ tractorbeam:
      srcSecretKey: "abcef123456"
      srcEndpoint: "https://sfo2.digitaloceanspaces.com"
      bucket: "my_redundant_bucket"
      method: "s3cmd"
      delete: yes
      prefix: "my/custom/prefix"
      accessKey: "vwxyz098765"
@@ -424,9 +453,12 @@ Each item in the list is an S3-to-S3 backup to perform, where:
* **srcRegion** is the S3 region in which `srcBucket` resides. Optional.
* **srcEndpoint** is the S3 API endpoint to use for the source bucket. Optional, defaults to AWS S3.
* **delete** specifies if files not present in the source bucket should be deleted in the target bucket. Optional, defaults to true.
* **method** specifies the command to use to perform the sync. Must be `awscli` or `s3cmd`. Optional, default is `s3cmd`.

By design, the S3-to-S3 backup is always performed *last* in Tractorbeam. This allows you to mirror previous backups easily.

Note that Tractorbeam uses [s3cmd](https://s3tools.org/s3cmd) instead of the AWS CLI to perform the sync by default. s3cmd works slightly faster and handles deep directories better. If you experience problems, you may fall back to the AWS CLI by setting the `method` key.
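
A hedged sketch of switching a single item back to the AWS CLI; the `tractorbeam.s3` list name and the `srcBucket`/`srcAccessKey` keys are assumed to mirror the keys documented above:

```yaml
tractorbeam:
  s3:
    - srcBucket: "my_bucket_name"
      srcAccessKey: "abcef123456"
      srcSecretKey: "abcef123456"
      srcEndpoint: "https://sfo2.digitaloceanspaces.com"
      bucket: "my_redundant_bucket"
      accessKey: "vwxyz098765"
      secretKey: "vwxyz098765"
      method: "awscli"
```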

## Deployment

Tractorbeam may be deployed in several ways, including base Docker, Docker Compose, Swarm, and Kubernetes.
2 changes: 1 addition & 1 deletion ansible/group_vars/backup.yml
@@ -1,5 +1,5 @@
---
flighdeck_motd_name: "Tractorbeam 2.2.2"
flighdeck_motd_name: "Tractorbeam 2.3.0"

flightdeck_groups:
- name: "backup"
2 changes: 2 additions & 0 deletions ansible/roles/tractorbeam/tasks/databases.yml
Expand Up @@ -7,6 +7,8 @@
login_host: "{{ _backup.host | default('localhost') }}"
login_port: "{{ _backup.port | default('3306') | int }}"
login_user: "{{ _backup.user | default(omit) }}"
client_cert: "{{ _backup.tlscert | default(omit) }}"
client_key: "{{ _backup.tlskey | default(omit) }}"
login_password: "\
{% if _backup.passwordFile is defined %}\
{{ lookup('file', _backup.passwordFile) }}\
15 changes: 13 additions & 2 deletions ansible/roles/tractorbeam/tasks/files.yml
@@ -6,11 +6,22 @@
  register: _files_temp_dir
  when:
    - _backup.cacheDir is not defined
- name: Create a cache directory for the backup
  file:
    state: directory
    path: "{{ _backup.cacheDir }}"
    owner: "backup"
    group: "backup"
    mode: "u=rwx,g=rxw,o=rwx"
  when:
    - _backup.cacheDir is defined
- name: Sync files from remote
  shell: >
    rsync
    -az
    -e "ssh -i {{ _backup.identityFile }} -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"
    {% for _option in _backup.options | default(['-az']) %}
    {{ _option }}{{ ' ' }}
    {% endfor %}
    -e "ssh -i {{ _backup.identityFile }} -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null {% if _backup.port is defined %}-p {{ _backup.port }}{% endif %}"
    {% for _exclude in _backup.excludes | default(_tractorbeam_default_files_excludes) %}
    --exclude={{ _exclude }}{{ ' ' }}
    {% endfor %}
11 changes: 11 additions & 0 deletions ansible/roles/tractorbeam/tasks/main.yml
Expand Up @@ -40,6 +40,17 @@
when:
- _backup.disabled | default(false) == false
- tractorbeam_scope in _backup.backupSets | default(_tractorbeam_default_sets_for_backup)
- name: Work with pantheon backups
include_tasks: "pantheon.yml"
vars:
_retain_count: "{{ _backupSet.retainCount }}"
loop: "{{ tractorbeam.pantheon | default([]) }}"
loop_control:
loop_var: _backup
no_log: "{{ flightdeck_debug | default(false) | ternary(false, true) }}"
when:
- _backup.disabled | default(false) == false
- tractorbeam_scope in _backup.backupSets | default(_tractorbeam_default_sets_for_backup)
- name: Work with platform.sh databases
include_tasks: "platformshDatabases.yml"
vars:
155 changes: 155 additions & 0 deletions ansible/roles/tractorbeam/tasks/pantheon.yml
@@ -0,0 +1,155 @@
---
- name: Create a temp directory to stage the backup
  tempfile:
    state: directory
    prefix: "files-{{ _backup.bucket }}-{{ _backup_timestamp }}"
  register: _pantheon_temp_dir
  when:
    - _backup.cacheDir is not defined
- name: Create a cache directory for the backup
  file:
    state: directory
    path: "{{ _backup.cacheDir }}"
    owner: "backup"
    group: "backup"
    mode: "u=rwx,g=rxw,o=rwx"
  when:
    - _backup.cacheDir is defined
- name: Authenticate with Pantheon
  shell: >
    terminus auth:login
    --machine-token={{ _auth }}
    {% if (flightdeck_debug | default(false)) != true %}
    --quiet
    {% elif lookup('env', 'ANSIBLE_VERBOSITY') == 4 %}
    -vvv
    {% endif %}
    --no-interaction
    --yes
  vars:
    _auth: "\
      {% if _backup.machineTokenFile is defined %}\
      {{ lookup('file', _backup.machineTokenFile) }}\
      {% else %}\
      {{ _backup.machineToken }}\
      {% endif %}"
- name: Create backup
  shell: >
    terminus backup:create
    {{ _backup.site }}.{{ _backup.environment }}
    --element={{ _backup.element | default('database') }}
    {% if (flightdeck_debug | default(false)) != true %}
    --quiet
    {% elif lookup('env', 'ANSIBLE_VERBOSITY') == 4 %}
    -vvv
    {% endif %}
    --no-interaction
    --yes
- name: Get list of backups
  shell: >
    terminus backup:list
    {{ _backup.site }}.{{ _backup.environment }}
    --element={{ _backup.element | default('database') }}
    --field=file
    {% if lookup('env', 'ANSIBLE_VERBOSITY') == 4 %}
    -vvv
    {% endif %}
    --no-interaction
    --yes
  register: _pantheon_backup_list
- name: Get the most recent backup
  set_fact:
    _pantheon_backup_filename: "{{ _pantheon_backup_list.stdout_lines | default([]) | first }}"
- name: Download backup
  shell: >
    terminus backup:get
    {{ _backup.site }}.{{ _backup.environment }}
    --file={{ _pantheon_backup_filename }}
    --to={{ _backup.cacheDir | default(_pantheon_temp_dir.path) }}/{{ _pantheon_backup_filename }}
    {% if lookup('env', 'ANSIBLE_VERBOSITY') == 4 %}
    -vvv
    {% endif %}
    --no-interaction
    --yes
  args:
    creates: "{{ _backup.cacheDir | default(_pantheon_temp_dir.path) }}/{{ _pantheon_backup_filename }}"
- name: Push backup to S3
  aws_s3:
    bucket: "{{ _backup.bucket }}"
    object: "{{ _backup.prefix | default('') }}/{{ tractorbeam_scope | default('') }}/{{ _pantheon_backup_filename }}"
    src: "{{ _backup.cacheDir | default(_pantheon_temp_dir.path) }}/{{ _pantheon_backup_filename }}"
    mode: put
    s3_url: "{{ _backup.endpoint | default(omit) }}"
    region: "{{ _backup.region | default(omit) }}"
    encrypt: no
    permission: private
    aws_access_key: "\
      {% if _backup.accessKeyFile is defined %}\
      {{ lookup('file', _backup.accessKeyFile) }}\
      {% else %}\
      {{ _backup.accessKey | default(omit) }}\
      {% endif %}"
    aws_secret_key: "\
      {% if _backup.secretKeyFile is defined %}\
      {{ lookup('file', _backup.secretKeyFile) }}\
      {% else %}\
      {{ _backup.secretKey | default(omit) }}\
      {% endif %}"
  no_log: "{{ flightdeck_debug | default(false) | ternary(false, true) }}"
- name: Get list of files
  aws_s3:
    bucket: "{{ _backup.bucket }}"
    prefix: "{{ _backup.prefix }}/{{ tractorbeam_scope | default('') }}"
    mode: list
    aws_access_key: "\
      {% if _backup.accessKeyFile is defined %}\
      {{ lookup('file', _backup.accessKeyFile) }}\
      {% else %}\
      {{ _backup.accessKey | default(omit) }}\
      {% endif %}"
    aws_secret_key: "\
      {% if _backup.secretKeyFile is defined %}\
      {{ lookup('file', _backup.secretKeyFile) }}\
      {% else %}\
      {{ _backup.secretKey | default(omit) }}\
      {% endif %}"
    s3_url: "{{ _backup.endpoint | default(omit) }}"
  register: _list_backup
- name: Filter and sort the list of files
  set_fact:
    _list_backup_sorted: "{{ _list_backup.s3_keys | select('regex', _regex) | list | sort }}"
  vars:
    _regex: "\
      {% if _backup.element | default('database') == 'database' %}\
      \\.sql\\.gz\
      {% else %}\
      \\.tar\\.gz\
      {% endif %}"
  no_log: "{{ flightdeck_debug | default(false) | ternary(false, true) }}"
- name: Delete old backups
  aws_s3:
    bucket: "{{ _backup.bucket }}"
    prefix: "{{ _backup.prefix }}/{{ tractorbeam_scope | default('') }}"
    mode: delobj
    object: "{{ item }}"
    aws_access_key: "\
      {% if _backup.accessKeyFile is defined %}\
      {{ lookup('file', _backup.accessKeyFile) }}\
      {% else %}\
      {{ _backup.accessKey | default(omit) }}\
      {% endif %}"
    aws_secret_key: "\
      {% if _backup.secretKeyFile is defined %}\
      {{ lookup('file', _backup.secretKeyFile) }}\
      {% else %}\
      {{ _backup.secretKey | default(omit) }}\
      {% endif %}"
    s3_url: "{{ _backup.endpoint | default(omit) }}"
  loop: "{{ _list_backup_sorted[:-(_retain_count | int)] }}"
  no_log: "{{ flightdeck_debug | default(false) | ternary(false, true) }}"
- name: Logout of Pantheon
  shell: >
    terminus auth:logout
    --no-interaction
    --yes
- include_tasks: "healhcheck.yml"
10 changes: 9 additions & 1 deletion ansible/roles/tractorbeam/tasks/platformshFiles.yml
@@ -6,7 +6,15 @@
  register: _platformshFiles_temp_dir
  when:
    - _backup.cacheDir is not defined

- name: Create a cache directory for the backup
  file:
    state: directory
    path: "{{ _backup.cacheDir }}"
    owner: "backup"
    group: "backup"
    mode: "u=rwx,g=rxw,o=rwx"
  when:
    - _backup.cacheDir is defined
- name: Copy identity files to ~/.ssh
  copy:
    src: "{{ item }}"