Joomla: Akeeba Backup and Amazon S3 (new version)
1. Secure your automatic backups by transferring them to the cloud on Amazon S3 with Akeeba Backup
A GOOD BACKUP STRATEGY IMPLIES NOT ONLY HAVING REGULAR AUTOMATIC BACKUPS, BUT ALSO PUTTING THEM IN A SAFE PLACE, AWAY FROM TECHNICAL PROBLEMS, PERSONAL MISTAKES… OR FROM HACKERS!
MARC DECHÈVRE | V3
2. Table of contents
Introduction
Objectives
The tools
General procedure
Very briefly (simplified case) | video version
Briefly | text version
Detailed procedure | screenshots version
Configuration of Amazon S3
Configuration of Akeeba Backup
Fetch backups from Amazon S3 with Akeeba
Encryption
Go further with Amazon S3
Interface / Permissions / Logging / Events / Versioning / Lifecycle
Applications to easily manage your files > Cyberduck
Create notifications
Sources
Conclusion
3. Introduction
Marc Dechèvre | marc woluweb.be | www.woluweb.be
Member of FeWeb - Belgian Federation of Web
Happy and proud member of the Joomla® Community
Joomla!® Certified Administrator (#16, first French-speaking)
Co-organiser of Joomla User Group (JUG) Wallonie
Founder & organiser of Joomla User Group (JUG) Ouagadougou
Member of Association Francophone des Utilisateurs de Joomla!™
Member of Joomla Certification Marketing Team
Regular speaker
@ JoomlaDays France & JoomlaDays Pays-Bas
@ JUG Wallonie & JUG Vlaanderen
Author of articles on www.cinnk.com and on slideshare.net/woluweb
4. Objective > share a best practice
The objective of this presentation is to share a best practice in terms of BACKUPS
You make regular backups (and you test them)? Great!
These backups are triggered automatically? Even better!
But is it enough? NO!
what if your hosting company has a technical problem?
what if the renewal date of your hosting was forgotten and the hosting company deleted your account?
what if you make a mistake on your website?
and above all: what if a hacker gains access to your site and destroys all your backups?
In all these cases, you would not only lose your site... but also all your backups at the same time!
That is why automatic and regular backups to the Cloud are a must.
But you should configure them properly and not too hastily, in order to avoid compromising all your websites and *increasing* your risk/exposure instead of *mitigating* it.
This is precisely the purpose of this presentation...
5. Objective > share the Joomla love
Why share my experience?
My answer in a few keywords: #OpenSource #Joomla #Jpositive #JoomlaUserGroups #JoomlaDays #Community
In other words, what makes Joomla different is its active Community ☺
Everything that helps improve the quality of the sites we all deliver with Joomla has a positive impact on the Community
You too, share your experience! ☺
Note:
I have absolutely no stake in Akeeba Backup or in Amazon S3 ☺ (I am not sponsored, I have no affiliation, no personal advantage, …)
If this presentation is based on those two tools, it is just because they are probably the best of their kind
6. Objective > backup… but also security
Last remark
A good backup strategy (on the server AND in the cloud) is a must…
… but it complements a good security strategy (it does not replace it!)
Indeed, if a site is infected, its backup will also be infected…
That is why a hacker typically likes to work in two phases:
first get access to the site and put a backdoor in place
but only activate the attack some time later
… so that even if you have a good old backup from a few months ago (which you think is "clean"), even that one is very likely to be infected as well
NB: how to check whether your site is clean?
See for example the presentation given at JoomlaDay France 2016,
« Est-ce que mon site a été hacké ? Est-il propre ? Comment m'en assurer » ("Has my site been hacked? Is it clean? How can I make sure?")
https://slides.aesecure.com
8. You will need…
Akeeba Backup is probably the best Backup tool for Joomla
The free version is already very powerful
But for Cloud Backup (FTP, Dropbox, Amazon S3, …), the PRO version is necessary
General comparison between free/pro versions :
https://www.akeebabackup.com/products/akeeba-backup.html > Feature Comparison
9. You will need…
Amazon S3 is a (very well-known) storage service in the Cloud
Its advantages compared to other options (competitors' FTP):
It lets you manage user rights ("write-only" for more security) and other things
It is infinitely "scalable": absolutely no size limit (but you pay for the volume you use, of course)
Description > https://aws.amazon.com/fr/s3
Free tier > https://aws.amazon.com/fr/free
Pricing > https://aws.amazon.com/fr/s3/pricing
Rates vary by Region, volume and type of storage (max. $0.03 per GB/month, so for example 10 GB of backups would cost at most about $0.30 per month)
But there is a "free tier" for 12 months with 5 GB
Direct link to the Console > https://console.aws.amazon.com/s3
10. You will need…
Alternatives to Amazon S3
Of course, Akeeba Backup is compatible not only with Amazon S3, but also with Dropbox, Microsoft OneDrive, Google Drive, etc. (see screenshot)
If your favourite Cloud solution also allows creating "write-only" users, then your backups will be protected from any person having (legitimately or not) access to your website
Otherwise, we recommend Amazon S3 for the reasons explained before
Anyway, whatever your Cloud solution, this presentation will serve as a guide
12. Simplified case (if only one site)
The next video gives an excellent overview of the configuration of Cloud Backup
https://www.akeebabackup.com/videos/1213-akeeba-backup-for-joomla-pro/1628-abtp07-remote-backup-amazon-s3.html
Nevertheless, it is a simplified case in the sense that it does not tackle the issue of Access Rights
If you only manage one website (or if you create a separate Amazon S3 account for each site), then it is fine
But if you manage several sites, you want to avoid that a user (and even more: a hacker!) of a given site could access / read / download / delete the backups of your other sites by using the Amazon S3 credentials* stored on that site!
* These credentials are indeed not encrypted: they have to be stored in plain text, at least in the database, so that your site can connect to Amazon S3
14. 1. Configure Amazon S3
1. Create an account on Amazon Web Services (AWS) for Amazon Simple Storage Service (Amazon S3)
2. Create a « bucket » (the equivalent of a folder in S3 terminology) and give it a name (no capital letters, no dots, no underscores)
3. Choose your preferred « Amazon S3 Region », for example « EU-Frankfurt »
4. Go to the « AWS Management Console » and then to the IAM menu
5. Create a User and generate its Keys (Access Key ID & Secret Access Key)
6. Create a Group of users & assign the created user to it
7. For that Group, attach a « Policy » (that policy should only allow the user to write, not to download, delete or anything else), either by using the Policy Generator or by using the example policy described later in this presentation
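For those who prefer scripting, here is a minimal sketch of steps 2-7 using Python and the boto3 AWS SDK (not part of the original procedure; the names "mybucket", "backup-writer" and "backup-writers" are placeholders to adapt to your setup):

import json
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")   # EU (Frankfurt)
iam = boto3.client("iam")

# Steps 2-3: create the bucket in your preferred Region
s3.create_bucket(
    Bucket="mybucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# Step 5: create the user and generate its keys (store them in a safe place!)
iam.create_user(UserName="backup-writer")
keys = iam.create_access_key(UserName="backup-writer")["AccessKey"]
print("Access Key ID:", keys["AccessKeyId"])
print("Secret Access Key:", keys["SecretAccessKey"])

# Step 6: create a group and assign the created user to it
iam.create_group(GroupName="backup-writers")
iam.add_user_to_group(GroupName="backup-writers", UserName="backup-writer")

# Step 7: attach the "write-only" policy (shown in full later in this presentation)
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowUploadOfMyBackups",
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": ["arn:aws:s3:::mybucket/site-*"],
    }],
}
iam.put_group_policy(
    GroupName="backup-writers",
    PolicyName="write-only-backups",
    PolicyDocument=json.dumps(policy),
)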
15. 2. Configure Akeeba Backup
Create a new profile
1. Make sure that your existing profile does indeed do what you want (for example, check potential folder exclusions, options, …)
2. Click on « Profiles Management »
3. Check the existing profile and click on the Copy button in order to duplicate it
Click on the Configuration button for the newly created profile
16. 2. Configure Akeeba Backup
Configure the new profile
1. Click on the Configuration button
2. Change the name of the Profile (« Profile Description »), for example : Amazon S3
3. Check (if you wish) the option « one-click backup icon »
4. For the field « Post-processing Engine », choose « Upload to Amazon S3 »
5. Click on the “Configure” button next to it
1. Enter the Access Key and the Secret Key for the "write-only" user
2. Make sure that the option « Use SSL » is checked (for your own security)
3. Enter the name of the « Bucket » you created
4. Select the « Amazon S3 Region » you have chosen
5. If you wish the backup to be stored in a subfolder of the Bucket, enter its name in the field « Directory »
Otherwise, leave that field empty
6. If you wish, check the following options « Delete archive after processing » and/or « Disable Multipart Upload »
6. If you like, check « Archive Integrity Check » for extra quality control
17. 2. Configure Akeeba Backup
Save and close configuration
Test it by launching a backup with this new profile
Check in Akeeba Backup and on Amazon S3 that the backup has run and has been transferred
As for any backup, download it and install it on your local server in order to check that everything is OK
18. 3. File Management in Amazon S3
It is also possible to make use of the more advanced features offered by Amazon S3
1. Permissions
2. Logging
3. Events
4. Versioning
5. Lifecycle
If you want to manage your backups from time to time, …
1. Go to the Console of Amazon S3
2. Or install for example Cyberduck (then you can manage and even sort your files)
20. Amazon S3 > create an account
Go to Amazon S3 website
https://aws.amazon.com/s3 (English)
https://aws.amazon.com/fr/s3 (Français)
Click on "Sign In to the Console"
Enter your email and password
Tip: please choose a strong password
Want to check whether your passwords are robust enough?
Just play with https://howsecureismypassword.net and you will understand ☺
21. Amazon S3 > create a Bucket
In the menu, select "Services > Storage > S3"
Click on the button "Create Bucket"
Enter the name of your choice
Given our strategy of a single "write-only" user for all the websites (see hereafter), we choose to put the backups of all the sites in the same Bucket… but you could make other choices
Choose your preferred Region
Attention: the rates can differ slightly from region to region. If you want a European server (be it for legal or personal reasons), choose for example Frankfurt
Click on "Next" or "Create"
22. Amazon S3 > IAM
You don't want to use your own root account ("root access keys", which gives all the powers) when configuring Akeeba Backup. So you want to create "sub-accounts"
In Amazon S3 terminology, these are called IAM Users (IAM = "Identity and Access Management"). In other words, access rights
To do that, go to Services > IAM (or, as shown on the screenshot, click on the suggestion "Get Started with IAM Users")
23. Amazon S3 > create a User
On the page Services > IAM, click on the side menu "Users"
Click on the button "Create New Users"
Enter the name of the User and check the option "Programmatic access" (see screenshot)
Click on "Next: Permissions"
(so far, we don't have Groups or Policies, so just keep clicking on Next until you can Create User)
24. Amazon S3 > save “Credentials”
Save your credentials (in a safe place*):
The Access Key ID
The Secret Access Key
(or click on "Download" in order to get them in a file)
* Use for example a password manager like https://lastpass.com or https://1password.com
25. Amazon S3 > create Policy
Click on the side menu "Policies"
Click on the button “Create Policy”
On the next screen, select the "Policy Generator"
26. Amazon S3 > configure Policy
At the step "Set Permissions", check:
Effect > Allow
AWS Service > Amazon S3
Action > PutObject
ARN > arn:aws:s3:::mybucket/site-*
where you adapt the name of mybucket (and where site-* implies that all our backups have a name starting with "site-")
That way, the created user will only be allowed to store the backups of your different websites, but will not be allowed:
To list the files
To download the files
Or to delete the files!
27. Amazon S3 > validate Policy
After that, you get the following screen
Click on the button “Next Step”
28. Amazon S3 > edit Policy even afterwards
You can see and edit any Policy via the side menu Policies
Example of a « write-only » policy here, with "arn:aws:s3:::mybucket/site-*", where mybucket is replaced by the name of your bucket and where site-* means that only files starting with « site- » can be uploaded
NB: this "policy" can be adapted (even afterwards) according to your needs. For example you could give more rights (like the right to delete backups, if you want to use Akeeba's quota management, which allows keeping only the last X backups). But again, beware: in that case a hacker could get access to your bucket and delete the backups of all your sites…
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowUploadOfMyBackups",
"Effect": "Allow",
"Action": [
"s3:PutObject"
],
"Resource": [
"arn:aws:s3:::mybucket/site-*"
]
}
]
}
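As a sanity check, here is a minimal sketch in Python/boto3 (not part of the original slides; bucket and key names are placeholders) verifying that the write-only keys can upload but can neither list nor download:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",         # the write-only user's keys
    aws_secret_access_key="...",
)

# Upload should succeed (the key matches arn:aws:s3:::mybucket/site-*)
s3.put_object(Bucket="mybucket", Key="site-test.jpa", Body=b"test")
print("upload: OK")

# Listing and downloading should both be denied
for action in ("list", "download"):
    try:
        if action == "list":
            s3.list_objects_v2(Bucket="mybucket")
        else:
            s3.get_object(Bucket="mybucket", Key="site-test.jpa")
        print(action + ": unexpectedly allowed, check your policy!")
    except ClientError as err:
        print(action + ": denied as expected (" + err.response["Error"]["Code"] + ")")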
29. Amazon S3 > create Group + associate Policy
Still on the page Services > IAM
Choose side menu "Groups"
Click on the button "Create New Group"
Continue the creation by attaching the policy you just created in the previous step
… and of course associate the created User with that Group
31. Akeeba Backup > Cloud Backup (profile)
Creating a new Profile in Akeeba Backup will give you the possibility of quickly launching either
The "traditional" backup (saved on the server itself)
Or the backup "in the cloud" (in the present case, on Amazon S3)
Go to your site's Administration, then to Components > Akeeba Backup
Click on the button "Profiles Management". Then choose between
Clicking on the "New" button if you want to start from an empty profile
Selecting an existing Profile and clicking on "Copy"
Choose a name for that new Profile
32. Akeeba Backup > Cloud Backup (config)
See the previous chapter for the text explanation of Akeeba Backup's configuration for Cloud Backup
Want to know more about Cloud Backup?
See also the slide "Sources" at the end of this presentation for links to more explanations on the Akeeba website
33. Akeeba Backup > Cloud Backup (config)
In the end, your configuration should look like this (see screenshot)
34. Akeeba Backup > Cloud Backup (details)
Test whether the backup file is correctly uploaded after a backup is taken
If you get an error message at that stage, try checking the option "Disable Multipart uploads"* in the Profile Configuration of Akeeba Backup
An interesting option (for your regular backups and of course also for your cloud backups): check the option "Archive Integrity Check", which consumes memory and CPU but gives you extra confidence that your file is fine.
But don't forget: at the end of the day, it is always your responsibility to regularly check that your backups are working (a backup configured to ignore some files and/or tables will perfectly pass the integrity test… but will not enable you to restore a site!)
* Excerpt from the Manual: If you get a RequestTimeout warning while Akeeba Solo / Akeeba Backup is trying to upload your backup archive to the cloud, you MUST go to the Configuration engine and enable the "Disable multipart uploads" option of the S3 engine. If you don't do that, the upload will not work. You will also have to use a relatively small part size for archive splitting, around 10-20 MB (depends on the host, your mileage may vary).
NB: in one of my chats with Akeeba, they advised 50 MB
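As an illustration only (Akeeba Backup implements this in PHP; the sketch below merely expresses the same trade-off with Python/boto3 and placeholder names), here is what "disable multipart uploads" versus "small parts" looks like:

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# "Disable multipart uploads": raise the threshold so the archive is sent
# in a single PUT request (what the manual recommends on RequestTimeout)
single_put = TransferConfig(multipart_threshold=5 * 1024**3)  # 5 GB

# Or keep multipart but with small parts, e.g. 10 MB (the manual suggests
# 10-20 MB; Akeeba support suggested 50 MB in the chat quoted above)
small_parts = TransferConfig(multipart_chunksize=10 * 1024**2)

# Pass Config=small_parts instead to try the small-part strategy
s3.upload_file("site-example.jpa", "mybucket", "site-example.jpa",
               Config=single_put)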
35. Akeeba Backup > Cloud Backup (details)
There is a nice option within Akeeba for managing Quotas (for example, keep only the last X backups and delete the older ones)
For the backups which stay on your server, this makes perfect sense
But for backups you send to the Cloud it makes no sense, as by definition you connect to Amazon S3 with credentials allowing you to upload… but not to list / download / delete.
So, in practice and in our case, we make sure that the option "Enable remote file quotas" is disabled
36. Akeeba Backup & Amazon S3 > troubleshooting
Having issues uploading your backups to Amazon S3?
See the following page from the Akeeba Backup documentation,
"My backup files are not being uploaded to Amazon S3"
https://www.akeebabackup.com/documentation/troubleshooter/abamazons3.html
37. Particular case of Backup OK / Upload NOK
If a backup is executed but not uploaded to Amazon S3 (for various reasons, see above), the CRON task "check backup" will not send an error message. Nor will the Control Panel.
It is only when you manually go to the list of backups that you would notice it (by looking at the text shown on the "upload to (…)" buttons).
In a ticket with Akeeba, they confirmed that there is no CRON task that can be put in place in order to detect a backup which has run but has NOT been uploaded
https://www.akeebabackup.com/support/akeeba-backup-3x/26190-backups-to-amazon-s3.html
QUESTION: The only question left: could a CRON job be created (probably a new feature, I guess) in order to check those sites with
- backup OK
- but upload to cloud NOK
ANSWER: No. We had very very few reports about this issue (about 2-3) on all the thousands of installations we have. Implementing such a script could cause more issues instead of removing them.
39. Fetch the backup from the Cloud
Need to reinstall a site from a backup stored on Amazon S3?
Easy: on the page listing the backups, just click on the button "Manage remotely stored files"
40. Fetch the backup from the Cloud
Then a popup window appears, giving 3 possibilities:
Fetch back to server
Delete
Download to your desktop
41. Fetch the backup from the Cloud
But of course, if you have done things properly, the "credentials" entered in the Configuration only allow writing to Amazon S3
Therefore, when clicking on "Fetch back to server", you'll get an error message (see screenshot)
Then how to do it? Here are 3 methods:
1. The easiest and quickest way: click on the button "Import Archives from S3" (next slide)
2. Simply change the "credentials" in the Profile Configuration (only for the duration of the download; right afterwards, don't forget to put the « write-only » credentials back in place!)
3. Or manually download the file from S3 and upload it to your site
42. Fetch the backup from the Cloud
Import the backup from the Cloud directly from the Control Panel of Akeeba
https://www.akeebabackup.com/documentation/akeeba-backup-documentation/ch03s02s05s03.html
Click on the button "Import Archives from S3"
On the new page, enter the "credentials" of a User having all rights
If needed, see the Cyberduck chapter hereafter for how to create such a User
In all cases, as a matter of principle, never use the "credentials" of the "root" user
Click on the button "Connect to S3"
You can then navigate within the Buckets and the files (choose a Bucket, then click on the button "Change bucket")
Click on the chosen file in order to fetch it
The file then becomes available in the list of all backups, just like the local backup files
NB: how to delete/reset those Access Key and Secret Key after use? Easy: nothing to do: the keys are stored in the Session, so they are automatically deleted as soon as you disconnect…
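If you prefer fetching the file yourself rather than going through Akeeba's import screen (method 3 above), here is a minimal sketch in Python/boto3, with placeholder names and the keys of a full-rights IAM user (never the root account):

import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",         # full-rights IAM user, not root!
    aws_secret_access_key="...",
)
# Download the backup archive next to the script, then upload it to your site
s3.download_file("mybucket", "site-example.jpa", "site-example.jpa")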
44. Encryption via Akeeba Backup
A backup done with Akeeba Backup typically has a ".jpa" extension
The ".jpa" format has no encryption, so anybody having a copy of a .jpa file (be it on your computer, your server or your Amazon S3) can read and install it
Therefore, if you want to go one step further for security reasons, you can use the encrypted format ".jps", which requires a password (which you should not lose, for future use)
https://www.akeebabackup.com/documentation/akeeba-backup/archiver-engines.html
https://www.akeebabackup.com/documentation/akeeba-backup-documentation/jps-archive-format.html
45. Encryption via Akeeba Backup
Beware, do not confuse the encrypted ".jps" format with the famous "ANGIE Password"
ANGIE is the restoration script of a backup
https://www.akeebabackup.com/documentation/akeeba-backup/angie-joomla.html
So, in practice, if you are restoring a backup "live" on a website, that password prevents a visitor from taking your place and accessing the installation screens
In other words, the ANGIE Password does not encrypt your file
By the way, if you yourself forgot your ANGIE Password required for restoration, don't worry: just delete the following file in order to skip the protection!
installation/password.php
46. Encryption via Amazon S3
Note that Amazon S3 also offers encryption possibilities
http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingEncryption.html
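For example, a minimal sketch in Python/boto3 (placeholder bucket name) enabling default server-side encryption (SSE-S3) on the bucket; note that this protects the files at rest on Amazon's side, while the ".jps" format protects the archive itself wherever it is copied:

import boto3

s3 = boto3.client("s3")
# Encrypt every new object in the bucket at rest with AES-256 (SSE-S3)
s3.put_bucket_encryption(
    Bucket="mybucket",
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}
        }]
    },
)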
48. AWS > interface customisation
Amazon Web Services covers tens of different Services
To have your own shortcuts for the Services you use:
Click on the Edit button
Drag & drop the icons in the menu bar
49. Bucket > Logging, Events, Versioning, Lifecycle
First open your Bucket, then click on Properties
You can configure:
Permissions
Logging
Events
Versioning
Lifecycle
Just try!
50. AS3 > Lifecycle rules
Very useful: Lifecycle Rules
For example, you can have your files deleted automatically after X days
You can also transfer your files to other, intermediate storage classes (for example Glacier, which is cheaper but does not give instant access to the files: a few hours of patience explain the price difference)
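A minimal sketch of such a rule in Python/boto3 (placeholder bucket name and prefix, example durations): move the backups to Glacier after 30 days and delete them after 180 days:

import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="mybucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "rotate-backups",
            "Filter": {"Prefix": "site-"},       # only the backup files
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 180},
        }]
    },
)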
51. AS3 > make file(s) public
To give access to file(s):
Select the file(s)
Go to the menu Actions > Make Public
On the right, when the Properties button is enabled, you see the direct URL of the file
53. Efficient management of files
The web interface of Amazon S3 is made to create & configure the Buckets, but not really to manage your files on a day-to-day basis
For example, you cannot even sort the files by Date or Name…
In order to manage your Amazon S3 files (i.e. upload / sort / delete / change rights / …), it is therefore better to use dedicated solutions:
Either software (FileZilla and WinSCP, the well-known FTP clients, do NOT support the S3 protocol)
http://cyberduck.io (Mac & Windows)
http://www.cloudberrylab.com/free-amazon-s3-explorer-cloudfront-IAM.aspx (free/pro)
Or browser extensions, for example
S3Fox for Firefox seems to be well-known, but I am not sure whether it is still maintained
http://www.s3fox.net
https://www.youtube.com/watch?v=L1cqzEYYUB0 (demo)
But in all cases, make sure you configure your tool with a dedicated IAM User having only the appropriate rights (read/write/…) according to what you want to do with it
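And if you prefer scripting to clicking, the same day-to-day management is only a few lines of Python/boto3 (placeholder names; use the keys of such a dedicated IAM user), for example listing the backups sorted by date, which the web interface cannot do:

import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",         # dedicated IAM user with read rights
    aws_secret_access_key="...",
)
response = s3.list_objects_v2(Bucket="mybucket", Prefix="site-")
# Sort by date, newest first (the S3 web interface cannot sort)
for obj in sorted(response.get("Contents", []),
                  key=lambda o: o["LastModified"], reverse=True):
    print(obj["LastModified"], obj["Size"], obj["Key"])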
54. Cyberduck > create new user
Of course, if you use such software, it means that you will potentially want to list / upload / download / delete files
So you should first create a new user who will receive only the desired rights
Save your credentials
55. Cyberduck > new User > Policy
Go to Services > IAM and the side menu Users
Open the user created for Cyberduck
Click on the tab "Permissions"
Click on the button "Attach Policy"
56. Cyberduck > new user > Policy > detail
A (very) long list of standard Policies is available
To keep it simple here, just check AmazonS3FullAccess and validate
NB: in order to find it quickly, use the search filter
57. Cyberduck > configuration
In Cyberduck, click on the button "Open Connection"
Choose S3
Enter the credentials
Connect
58. Cyberduck > create a shortcut (Favourite)
The list of files is shown
You can:
Sort (which is impossible with the web interface of Amazon S3…)
Upload
Delete
Configure the Bucket
Etc.
Go to the menu Bookmark > New Bookmark if you don't want to re-enter your credentials every time
59. Cyberduck > get access to the shortcut
Thanks to the Bookmarks, the connection will be directly available for future uses (see screenshot)
60. Cyberduck > advanced configuration of AS3
NB: from Cyberduck, you can also directly configure Permissions, Lifecycle, etc.
62. Amazon S3 > create notifications (SNS)
You want to be notified, for example, when one of your websites has uploaded its backup to Amazon S3?
This is of course just a nice-to-have, not a must…
… but it is feasible
That is the purpose of this chapter, based on a very simple example of notification by email that you can refine according to your needs
But it is also possible to go further by using the APIs or by integrating the notifications into your applications
63. SNS > create new Topic
Go to Services > SNS
Create a new Topic
Give it a name
Validate
64. SNS > edit topic
Still from Services > SNS, you can edit topics
65. SNS > edit Topic Policy
You need to create a Topic Policy in order to authorise your Bucket to post to SNS
To keep it simple, just choose "Everyone" for the Users who are allowed to publish on this Topic
66. SNS > subscribe to Topic
Subscribe to your own topic (Subscription)
Choose Email as Protocol
Enter your email address as Endpoint
68. SNS > confirmation message
Example of the Subscription Confirmation message as received by the recipient
69. SNS > subscription confirmed
By clicking on the link in the message, the recipient gets the confirmation of his/her subscription
70. SNS > add notification
Go back to the Bucket
Click on the button Properties
In Events, click on Add Notification
71. SNS > create Event for Bucket
Fill in the fields
Choose which events will trigger the notification (in this example, only "Put")
For the field "SNS Topic", the dropdown menu directly offers the names of the created Topics (no need to write the ARN manually)
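The whole chapter can also be scripted. Here is a minimal sketch in Python/boto3 (not in the original slides; names and email address are placeholders), which also uses a topic policy restricted to your bucket, a safer alternative to the "Everyone" shortcut above:

import json
import boto3

sns = boto3.client("sns", region_name="eu-central-1")
s3 = boto3.client("s3")

topic_arn = sns.create_topic(Name="backup-notifications")["TopicArn"]
sns.subscribe(TopicArn=topic_arn, Protocol="email",
              Endpoint="you@example.com")   # confirm via the email you receive

# Safer than "Everyone": only your bucket may publish to the Topic
sns.set_topic_attributes(
    TopicArn=topic_arn,
    AttributeName="Policy",
    AttributeValue=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "SNS:Publish",
            "Resource": topic_arn,
            "Condition": {
                "ArnLike": {"aws:SourceArn": "arn:aws:s3:::mybucket"}
            },
        }],
    }),
)

# Notify on every upload ("Put") into the bucket
s3.put_bucket_notification_configuration(
    Bucket="mybucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": topic_arn,
            "Events": ["s3:ObjectCreated:Put"],
        }]
    },
)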
72. SNS > example of notification
On this screenshot, an example of the notification email sent automatically after a backup is uploaded to the Bucket
This being said, Amazon Web Services offers even more possibilities, like
Notification by SMS
Integration with other services (SQS, …)
74. Alternative with Installatron
Personally, I am very happy with the combination Akeeba + Amazon S3
But of course, it is possible to find alternatives
For example, the Joomla User Group Vlaanderen has published an interesting article about Installatron, which can also be used remotely (« IR ») to create and restore backups
Here is a more detailed procedure:
http://www.jugvlaanderen.be/websites-bouwen/develop-design-content/23-design-development/233-simpel-backupen-restoren
76. Sources > Documentation & Support Akeeba
General explanations of Cloud Backup
https://www.akeebabackup.com/documentation/akeeba-backup-documentation/step-by-step-guides.html
https://www.akeebabackup.com/documentation/akeeba-solo/how-to-cloud-backup-s3.html
Explanations about security in particular when you manage several sites
https://www.akeebabackup.com/support/akeeba-backup-3x/8694-is-amazon-s3-secret-key-a-secret.html
https://www.akeebabackup.com/support/akeeba-backup-3x/9084-secure-back-ups-with-amazon-s3.html
https://www.akeebabackup.com/support/akeeba-backup-3x/8835-how-is-amazon-key-info-saved.html
An « open guide » about AWS
https://github.com/open-guides/og-aws
+ all the links mentioned throughout the slides
77. Most interesting excerpts from these sources
That post describes how to give only the PutObject privilege to the user. You should also add
the DeleteObject privilege for quotas to work :)
The other possibility is to create a write-only Amazon S3 user and use that instead. As you
observed, that would render the quotas ineffective, as the user would be unable to delete old
backups. However this is the most secure option, as a potential hacker can never access your
backups (download or delete).
The only thing you can't do is to transfer the backup archives from S3 to your server for easy
restoration. But this is easy to work around; just go to the Configuration page and enter your
regular access credentials before using the Manage Remote Files feature of Akeeba Backup :)
79. A complete backup strategy
Here we are! You now know all you need to improve the backup strategy (part of the security strategy) of your site, by having backups which are totally independent of your site
You are now protected from technical problems, from mistakes… or from hackers
But don't forget that the security of a site also entails many other dimensions than the backup strategy. See for example
https://slides.aesecure.com
Suggestions, ideas for improvement, corrections about this presentation?
Don't hesitate to let me know!
80. Thank you
Don't hesitate to get in touch!
Marc Dechèvre
+32 474 37 13 12
marc woluweb.be
Skype : woluweb
woluweb.be
twitter.com/woluweb
facebook.com/marc.dechevre
linkedin.com/in/marc-dechevre-68b8172a
81. Oh, one more thing…
Hidden heart in the Joomla!® logo – sources
Joomla! Magazine – August 2014