Import and Export MySQL and PostgreSQL

There are a number of ways to export and import data from a database. You can use a Mac client like Sequel Pro for MySQL, for example. You can do the same for PostgreSQL, though I have not settled on a Mac client for PostgreSQL.

Mac clients aside, you can import and export your database SQL from the command line or from a bash script on your server as well. I use the following commands to handle importing and exporting SQL on my servers and development machines.


If you want to export PostgreSQL:

pg_dump -U [username] [database] -f [filepath]

If you want to export MySQL:

mysqldump -u [username] -p[root_password] [database] > [filepath]

Note that there is no space after -p in the MySQL command.


If you want to import PostgreSQL:

psql [database] < [filepath]

If you want to import MySQL:

mysql -u [username] -p[root_password] [database] < [filepath]
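These commands pair naturally with a timestamped filename so repeated dumps don't overwrite each other. A minimal sketch; the database name "mydb", user "myuser", and the /tmp path are placeholders, and the actual dump commands are shown commented out:

```shell
# Build a timestamped dump path so repeated dumps don't collide
# ("mydb", "myuser", and /tmp are hypothetical placeholders)
STAMP=$(date +"%y-%m-%d-%H%M%S")
DUMPFILE="/tmp/mydb_${STAMP}.sql"
echo "would dump to: $DUMPFILE"

# On a real server you would then run one of:
# pg_dump -U myuser mydb -f "$DUMPFILE"
# mysqldump -u myuser -p[root_password] mydb > "$DUMPFILE"
```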


You can read about the backup process for PostgreSQL on their website for more details.

For MySQL, the documentation is broken up into multiple pages. One for exporting or dumping your database and another for importing your exported database.


Backup Strategy using Amazon S3 and Cron Jobs

Backing up files and storing them in a remote location should be done on a regular basis. Backups are vital. They ensure your code and assets are available if your server is hacked or destroyed, or if your data is lost or corrupted.

In WordPress you could use a plugin like BackupBuddy to manage your backups. However, not everyone uses WordPress or wants to use a plugin to manage their backups.

You do not need to be a server guru to manage your own backups. All you need is one shell script on your Linux installation, one cron job to run it, and an Amazon S3 bucket.

If you don’t feel comfortable managing your own backups, use a plugin, a system admin or a service. Backups are too important to shortcut.

In this tutorial

We will use a basic Ubuntu server. I’m using root at my own risk, but you should use your own sudo user.

  • Install AWS CLI so we can upload your files to S3
  • Create a shell script to automate backups
  • Create a server cron job to back up your site every day

Installing AWS CLI

To install the AWS CLI, make sure you have Python and pip installed: Python 2 version 2.6.5+ or Python 3 version 3.3+.

Here are the simple steps:

  1. Install the AWS CLI via pip: sudo pip install awscli
  2. Configure AWS with your credentials: aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtn4MI/K7MDGENG/bPxRfiCY4EXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: [press ENTER]

To test if you can access your buckets, run aws s3api list-buckets. This will list your buckets in JSON format if you have permissions.


Now, get the AWS CLI location:

which aws

Use this in your script to access aws.

Backing Up Files

To back up the website or app files on the server we will use tar. Everything will be on S3, so we will keep only the last 4 days on the server by running some basic cleanup. Take a look at the shell script located at /root/


#!/bin/bash

BCKTIME=$(date +"%y-%m-%d-%H%M%S")
# archive name matches the full_backup_* pattern used by the find commands below
BCKFILE="full_backup_$BCKTIME.tar.gz"
echo "backup to file: $BCKFILE"

echo "backing up files..."
cd /var/www/
tar czf "$BCKFILE" html --exclude='.git'

echo "moving backup..."
find /var/www/full_backup_* -type f -exec mv {} /root/bck/ \;

echo "moving to safe place..."
cd /root/bck/

echo "cleanup old backups..."
find /root/bck/full_backup_* -type f -mtime +3 -exec rm {} \;

echo "uploading new file to S3..."
/usr/local/bin/aws s3 cp "/root/bck/$BCKFILE" s3://your-bucket/

echo "finished!"
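The tar-and-cleanup pattern in the script above can be exercised locally in a throwaway temp directory; no web root or S3 bucket needed. All paths here are disposable stand-ins for /var/www and /root/bck:

```shell
# Self-contained demo of the tar + move + cleanup pattern
WORK=$(mktemp -d)
mkdir -p "$WORK/www/html" "$WORK/bck"
echo "hello" > "$WORK/www/html/index.html"

BCKTIME=$(date +"%y-%m-%d-%H%M%S")
BCKFILE="full_backup_$BCKTIME.tar.gz"

cd "$WORK/www"
tar czf "$BCKFILE" html --exclude='.git'

# move the archive to the backup directory, as the script does
find "$WORK/www" -maxdepth 1 -name 'full_backup_*' -type f -exec mv {} "$WORK/bck/" \;

# delete archives older than 3 days (a no-op for this fresh file)
find "$WORK/bck" -name 'full_backup_*' -type f -mtime +3 -exec rm {} \;
```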

Remember to make the script executable: chmod +x /root/


Crontab or cron jobs are really easy if you know what the 5 asterisks (*) are for.

minute hour day-of-month month day-of-week [command]
* * * * *
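A few example schedules make the five fields concrete. The script path /root/some-script below is a hypothetical placeholder:

```shell
# crontab examples (field order: minute hour day-of-month month day-of-week)
0 3 * * *       /root/some-script      # every day at 03:00
*/15 * * * *    /root/some-script      # every 15 minutes
0 0 1 * *       /root/some-script      # midnight on the first of each month
30 11 * * 1-5   /root/some-script      # weekdays at 11:30
```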

For the files backup we want the cron job to run at 11:30 every day. You can access your user’s crontab with crontab -e.

30 11 * * * /root/ >/dev/null

Allow cron access to AWS CLI

Now you need to inform cron of your AWS credentials. Locate your AWS config file and edit the crontab.

crontab -e

Add the env variable above your jobs at the top of your crontab. The AWS CLI reads AWS_CONFIG_FILE, and aws configure writes its config to ~/.aws/config, which for root is /root/.aws/config.

AWS_CONFIG_FILE="/root/.aws/config"

30 11 * * * /root/ >/dev/null

Backing Up PostgreSQL

Backing up the database is also important. Since I’m using PostgreSQL, I’ll use pg_dump. These backups will stay on the server for 2 days and then cleanup will remove them. Take a look at the shell script located at /root/


#!/bin/bash

BCKTIME=$(date +"%y-%m-%d-%H%M%S")
# dump name matches the db_backup_* pattern used by the cleanup below
BCKFILE="db_backup_$BCKTIME.sql"
GZFILE="$BCKFILE.tar.gz"

echo "backup db to file: $BCKFILE"
pg_dump -U username database -f "/root/db_bck/$BCKFILE"

echo "compressing..."
cd /root/db_bck/
tar czf "$GZFILE" "$BCKFILE"
rm "/root/db_bck/$BCKFILE"

echo "cleanup old backups..."
find /root/db_bck/db_backup_* -type f -mtime +1 -exec rm {} \;

echo "uploading new file to S3..."
/usr/local/bin/aws s3 cp "/root/db_bck/$GZFILE" s3://your-bucket/

echo "finished!"
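The compress-then-remove step can be exercised locally on a fake dump file, no database required. The temp paths and dump contents here are throwaway stand-ins:

```shell
# Demonstrate the compress + remove pattern on a fake dump
WORK=$(mktemp -d)
BCKTIME=$(date +"%y-%m-%d-%H%M%S")
BCKFILE="db_backup_$BCKTIME.sql"
GZFILE="$BCKFILE.tar.gz"

echo "-- fake dump contents" > "$WORK/$BCKFILE"

cd "$WORK"
tar czf "$GZFILE" "$BCKFILE"   # compress the dump
rm "$WORK/$BCKFILE"            # remove the uncompressed copy
```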

Again, make the script executable: chmod +x /root/

Edit the cron job one last time to run the database backup on the hour, every hour. Again, crontab -e.

30 11 * * * /root/ >/dev/null
0 * * * * /root/ >/dev/null

To check your final cron results list your crontab.

crontab -l

Backing Up MySQL

If you want to back up MySQL as well, replace:

pg_dump -U [username] [database] -f "/root/db_bck/$BCKFILE"

with:

mysqldump -u [username] -p[root_password] [database] > "/root/db_bck/$BCKFILE"

Note that there is no space after -p in the MySQL command.

Backup Security

If your backups include credit card numbers, email addresses or passwords, you will want to encrypt your backup before sending it to a remote destination. Local backups, however, need no encryption as long as your server itself is secure.
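As a sketch, you can encrypt an archive symmetrically with openssl before the S3 upload. The passphrase and file names below are placeholders; in practice the passphrase should come from a protected file, not an inline literal, and -pbkdf2 assumes OpenSSL 1.1.1+:

```shell
# Encrypt a backup before upload, then decrypt to verify the round trip
WORK=$(mktemp -d)
echo "sensitive backup data" > "$WORK/backup.tar.gz"

# encrypt (placeholder passphrase "changeme")
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in "$WORK/backup.tar.gz" -out "$WORK/backup.tar.gz.enc" \
  -pass pass:changeme

# decrypt to verify the file survives the round trip
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in "$WORK/backup.tar.gz.enc" -out "$WORK/restored.tar.gz" \
  -pass pass:changeme
```

You would then upload only the .enc file to S3.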

Cleanup S3 Backups

Chances are you will not want to keep your backups forever. On the local server we have already set a removal date for the old backups. On S3 we need to do the same.

Within S3, locate your bucket properties and find the “Lifecycle” option. Then create a rule to delete old backups any number of days after they were created.

Here I created a rule called “Delete Old Backups” that runs on the whole bucket and permanently deletes items 3 days after they were created.

S3 Lifecycle
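The same rule can be expressed as a lifecycle configuration document and applied with the AWS CLI. The bucket name is a placeholder, and the apply command is shown commented out since it requires live credentials:

```shell
# Write a lifecycle config equivalent to the console rule above
WORK=$(mktemp -d)
cat > "$WORK/lifecycle.json" <<'EOF'
{
  "Rules": [
    {
      "ID": "Delete Old Backups",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Expiration": {"Days": 3}
    }
  ]
}
EOF

# Apply it on a real account (placeholder bucket name):
# aws s3api put-bucket-lifecycle-configuration \
#   --bucket your-bucket --lifecycle-configuration "file://$WORK/lifecycle.json"
```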

Installing Xdebug for PHP7

Xdebug takes debugging PHP apps and websites to a level you can’t reach with the manual process of sprinkling var_dump() calls through your code.


To install Xdebug for PHP7 on Ubuntu you will need to do so manually. Ubuntu 15 and lower do not come with a package for PHP7 or its Xdebug counterpart.

First, be sure you are using PHP 7 already by checking your version.

php -v

Should give you something like:

PHP (cli) ( NTS )
Copyright (c) 1997-2015 The PHP Group

In my case I’m using the Laravel Homestead PHP 7 branch with Vagrant and the VirtualBox provider. So, I have PHP 7 already installed and working with Nginx.

Get your php.ini

Next, output your php.ini information into a file you can refer back to. I like to save mine to a file called php-info.txt.

sudo php -i > ~/php-info.txt

Use the Xdebug Wizard

Paste the contents of that file into the wizard at Xdebug Wizard. Then follow the instructions the wizard supplies.

Also, if you are using OSX you may need to run brew install autoconf.


The instructions should look something like:

  1. Download xdebug-2.4.0.tgz (I like to use wget -O ~/downloads/xdebug-2.4.0.tgz on Ubuntu)
  2. Unpack the downloaded file with tar -xvzf xdebug-2.4.0.tgz
  3. Run: cd xdebug-2.4.0
  4. Run: phpize (See the Xdebug FAQ if you don’t have phpize.)

Part of its output should look like:

Configuring for:
Zend Module Api No:      20151012
Zend Extension Api No:   320151012

If it does not, you are using the wrong phpize. Please follow this FAQ entry and skip the next step.

  5. Run: ./configure
  6. Run: make
  7. Run: cp modules/ /usr/lib/php/20151012
  8. Edit /etc/php/7.0/cli/php.ini (and the corresponding fpm php.ini) and add the line
zend_extension = /usr/lib/php/20151012/

Enable Remote Xdebug

To use Xdebug remote debugging on a host computer you need to enable remote debugging on the guest server. In my case the guest is the Vagrant Homestead VM.

In the guest php.ini file add:

xdebug.remote_enable = 1
xdebug.remote_port = 9000

Since I use PhpStorm for debugging, I set my Xdebug key to PHPSTORM. In the browser, Xdebug and PhpStorm will look for a cookie called XDEBUG_SESSION with the value PHPSTORM. To set this cookie I use the Chrome browser extension Xdebug helper.

Reboot Services

To load the new configurations reload your PHP and HTTP server services. In my case that is nginx and php7.0-fpm.

sudo service php7.0-fpm restart
sudo service nginx restart

Configure PHP Storm

Now that Xdebug is installed, we are ready to configure a “PHP Remote Debug” configuration in PhpStorm.

  1. Under PhpStorm > Preferences > Languages & Frameworks > PHP > Debug, be sure Xdebug is set to port 9000
  2. Under PhpStorm > Preferences > Languages & Frameworks > PHP > Servers, add a server using port 80 and set the absolute path on the server
  3. Under Run > Edit Configurations... add a “PHP Remote Debug”
  4. Set the server and the IDE key (session id) to PHPSTORM

Using PhpStorm with Xdebug on a Remote Server

To start receiving requests from the remote server, enable “Debug” with Xdebug helper in Google Chrome on the page you want to debug. Then, unlike local testing that uses “Start Listening for PHP Debug Connections”, use Run > Debug instead.

Using the “Debug” command manually starts the listener and opens a direct connection to the remote server.

Set breakpoints and you are off.

Out of Network Servers

If you are trying to remote debug a website with Xdebug from outside your local network, be sure port 9000 is not blocked on your network. This is typically done at the router level.

As always consult the Xdebug remote documentation.

Next, ask Google “what is my public ip” and it will tell you.

Then be sure your remote server is not blocking your IP address.

Simple Image Optimization and Reduction

Image optimization and reduction is critical to website performance and speed. File minification and gzip work well for text-based assets like HTML, CSS, SVG and JavaScript, but images don’t take advantage of these types of compression.

To minify images I like to use reducers and compressors that don’t visibly damage image quality.

Mac Image Optimization

On OS X I like to use ImageOptim for a solid image pass through. I also use ImageAlpha for PNG images that require more refinement.

Photo of imageoptim


Ubuntu Server Image Optimization

For Ubuntu servers where I’m dealing with client-uploaded images, or images that are not contained within an application, I use two image optimizers to pass over the files and help a little. This technique is not meant to be highly aggressive.

These tools are jpegoptim and optipng. To install them on Ubuntu use apt-get.

apt-get install optipng
apt-get install jpegoptim

Ubuntu’s apt-get installs an old version of optipng, but I have not had any issues with it.


For JPEG files jpegoptim works well when used with the find command.

find YOUR_DIR_NAME_HERE -iname '*.jpg' -type f -exec jpegoptim {} \;

For PNG files I use optipng.

find YOUR_DIR_NAME_HERE -iname '*.png' -type f -exec optipng -o3 {} \;
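A quick way to preview which files those find commands will touch, using throwaway files in a temp directory; no optimizers need to be installed for this dry run:

```shell
# -iname is case-insensitive, so B.PNG matches '*.png'
WORK=$(mktemp -d)
touch "$WORK/a.jpg" "$WORK/B.PNG" "$WORK/notes.txt"

JPGS=$(find "$WORK" -iname '*.jpg' -type f)
PNGS=$(find "$WORK" -iname '*.png' -type f)

echo "jpg files: $JPGS"
echo "png files: $PNGS"
```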