Upgrading VPS Web Server from Ubuntu 10.04 to 12.04

I am not a “real” system administrator but I manage several VPS servers for my own web sites. This is what I did to upgrade from Ubuntu 10.04 LTS to Ubuntu 12.04 LTS on a VPS.

  1. I already have all the code and databases backed up off the server on a regular schedule. Before the upgrade I also backed up various configuration files to another machine: sites-available files, cron directories, .bashrc, .profile and keys (see the backup sketch after this list).
  2. I tried running the preferred upgrade command

    [bash]do-release-upgrade[/bash]

    But got the message

    -bash: do-release-upgrade: command not found

    Then I ran

    [bash]sudo apt-get install update-manager-core[/bash]

    Then I was able to run [bash]sudo do-release-upgrade[/bash]

  3. Running the upgrade resulted in this somewhat scary message

    This session appears to be running under ssh. It is not recommended to perform a upgrade over ssh currently because in case of failure it is harder to recover.

    If you continue, an additional ssh daemon will be started at port ‘9004’.
    Do you want to continue?

    I looked around on the web for advice; “What’s the risk of upgrading over SSH?” seems a good recap. I continued, and then got this message

    To make recovery in case of failure easier, an additional sshd will be started on port ‘1022’. If anything goes wrong with the running ssh you can still connect to the additional one.
    If you run a firewall, you may need to temporarily open this port. As this is potentially dangerous it’s not done automatically. You can open the port with e.g.:
    ‘iptables -I INPUT -p tcp --dport 1022 -j ACCEPT’

  4. 12 minutes after the upgrade started, the download was complete and I got a message asking whether I wanted to manually approve every restart required or approve them all automatically. I chose the option to have them all restart automatically.
  5. Then lots of files were installed and I was asked about various configuration files where I (or some script) had changed or deleted the default file and the upgrade now wanted to replace it. I had to guess what to do in those cases. The installer lets you look at the diff between your existing file and the proposed replacement. I would prefer if the default behavior were to create a backup of the file in the same directory, letting you choose whether to keep your file or the updated file as the active one.
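
Here is a minimal sketch of the kind of pre-upgrade backup mentioned in step 1; backup-host, the destination path and the exact directories are placeholders for whatever applies on your server:

[bash]# copy key configuration files to another machine before upgrading
# (backup-host and /backups/vps are placeholders - adjust for your setup)
rsync -av /etc/apache2/sites-available/ user@backup-host:/backups/vps/sites-available/
rsync -av /etc/cron.d/ /etc/cron.daily/ user@backup-host:/backups/vps/cron/
rsync -av ~/.bashrc ~/.profile ~/.ssh user@backup-host:/backups/vps/home/[/bash]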

The entire process took exactly 30 minutes, with the system restarting in Ubuntu 12.04.

Thankfully the upgrade seems to have gone through without causing any problems. That is normally the case, but even a small likelihood of running into issues is worrisome, since those issues can pose serious problems, especially for someone with very limited system administration ability.

I am extremely thankful for all the programmers that created the code to make this process so straightforward and reliable.

[bash]lsb_release -a[/bash]

lets you see the current Ubuntu version you are running.
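
On a 12.04 system the output looks something like this (the exact point release and details may differ):

[bash]Distributor ID: Ubuntu
Description:    Ubuntu 12.04 LTS
Release:        12.04
Codename:       precise[/bash]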

When I upgraded a server that had Ruby on Rails applications almost everything worked fine. Passenger was unable to load the application, complaining about

libmysqlclient_r.so.16: cannot open shared object file: No such file or directory - /usr/local/rvm/gems/ruby-1.9.2-p290/gems/mysql2-0.3.11/lib/mysql2/mysql2.so

I found the solution to that issue was to uninstall the mysql2 gem and then run bundle update, which worked great. The uninstall would not work for me unless I was actually root (sudo didn’t work), so I used su to log in as root and then ran

[bash]gem uninstall mysql2[/bash]

Then I went back to my regular user and ran bundle update

[bash]bundle update rake[/bash]

Then everything was working fine.

Related: Keeping Your Hosted Ubuntu Web Server Software Up to Date – Adding a Key to Your Server for SSH Login – Checklist: Setting Up a New Domain on VPS

Setting Up 2 Factor Authentication for Gmail

Sadly there are significant risks to having valuable information accessible via the internet. Someone gaining access to your accounts is a huge risk, and dealing with what they can do with that access can be a huge hassle. Email is one of the most critical accounts: it holds lots of information you need, and often the email account can be used to gain access to many other accounts (financial sites are obviously also critical, as are others). You should set up 2 factor authentication for critical sites.

2 factor authentication is made up of 1 thing you know and 1 thing you have. You know your password. The easiest other item is a smart phone (that is the default option for Gmail and most other places for the thing you have). Some banks and computer networks (intranet access) use security tokens. Either way they give you a code which you enter after you enter your username and password.

Gmail offers an app for smart phones that will generate these codes even if you don’t have cell phone service. Also, if you don’t have a smart phone, or lose yours, there is a backup you can have (pre-printed codes). Make sure you print these and keep them somewhere you can access them (Google suggests your wallet).

With Gmail you will be prompted for your 2nd factor if you attempt to log in from a computer that has not been authorized previously. If you are using a computer you previously logged in from and told Google to authorize, you will log in normally. Every 30 days Google will then have you provide the 2nd factor again (I think this is right; I might not have the exact details of when Google asks for the 2nd factor).

Whatever email account you have, I strongly recommend setting up 2 factor authentication. You then must be sure you can still gain access yourself. It is a bother, but you need to make sure you understand the process and how to cope with issues (losing your smart phone, etc.).

Once you set up your account for 2 factor authentication, make sure you add a secondary backup phone number (in case you lose your phone or it is stolen). And make sure you print out the backup codes.


Very Simple Process to Claim Authorship of Pages with Google

Update, August 2014: Google abandoned this project, so there is no point wasting your time on it. As I have stated elsewhere, Google’s efforts are so unreliable (Google nearly always fails and then closes whatever you invested your time in) that you should carefully consider whether to bother with their offerings (much more so than with any other company). This has been true since Larry Page became CEO, and it means I avoid Google initiatives. I also abandon, or at least stop investing time in, anything Google buys. Without a strong extra case to justify bothering with something from an organization proven to be so unreliable, it just isn’t sensible to treat Google efforts as dependable.

With authorship I figured that even with Google’s poor track record this was worth bothering with. Oops. I have to re-calibrate and assume an even smaller chance that a Google effort won’t just be a waste of my time in the future.

There are lots of posts explaining how to claim authorship of pages with Google. I find them very complex, by and large.

I was able to use this simple process.

  1. Include a by-line with a link to the author page using rel="author". In WordPress you can just edit the theme template to have the author link include the rel="author" attribute.
  2. [html]<a rel="author" href="http://johnhunter.com/">John Hunter</a>[/html]

    For a WordPress blog with multiple authors, here is the syntax to use (this pulls the author url from their profile; the author can update their web site url when logged into the blog):

    [php]<a rel="author" href="<?php the_author_meta('user_url'); ?>"><?php the_author(); ?></a>[/php]

  3. Then connect the author page to your Google+ profile. You can use almost any page (Google may exclude some free sites). Most people use the about page or author page on the same blog they are trying to claim authorship of, but all that really matters is linking this page to your Google+ profile (obviously substitute your own url). I have many sites with my material, so my home page is what I used; I just added the following to that page.

    [html]<a href="https://plus.google.com/u/0/123" rel="me">Google+</a>[/html]

  4. Then link your Google+ profile to the authorship page you want to use. You can add it under “Other profiles” or “Contributor to”. This is the trickiest part due to the very bad UI of Google+: you have to go into edit mode and then click on the areas (there is no indication they are editable until you click – extremely bad UI; Google seems to like these hard to discover hidden UI elements lately, hopefully that will end soon). Likely Google will fix this at some point so this part will no longer matter.

To test whether things are working you can go to the rich snippets area in Google webmaster tools. Test a url you claimed authorship of and you should see something like:

[Image: search result with author photo]

Google can now use your Google+ profile to include your photo next to search results.

With those simple changes over 1,000 posts on my blog were updated. It took far longer to search for and read a bunch of more complex approaches than to actually do this once I saw how little had to be done. Writing this post took way longer. Now I just have to do step one for my other sites and blogs and they will all be updated to show my authorship (for those I already had listed on my Google+ profile; if they were not listed there yet they have to be added).

There are many benefits to establishing authorship through Google.

  • Google can include your photo next to search results. Especially until people get tired of this, it will likely increase clicks on your links. And even after people are tired of it, if they notice your photo (and respect you) that may well increase clicks.
  • Builds your personal brand
  • Google can use authorship as an additional factor in calculating the worth of a page for a specific search.
  • Google can get a feel for what areas you are an expert in. First by just analyzing what you write about. But more importantly they will be able to use this extra layer of information to determine AuthorRank (a similar idea to PageRank) by seeing links to your authored material.
  • Hopefully this can allow for RSS feeds by author (no matter where the content is published) eventually, if it doesn’t already.

Knowing authorship will allow Google to improve search result quality, so I can understand why they are pushing for it. It is a bit annoying how they keep pushing Google+ but this implementation seems tolerable. Some posts make it sound like you have to make your “authorship page” the Google+ profile, but really you can decide the authorship home page (for me johnhunter.com). They do pull your profile photo from Google+ to use, which is less than ideal.

Installing MySQL on Snow Leopard

I had all sorts of trouble installing MySQL on Snow Leopard. I never had any trouble installing mysql on Ubuntu.

Installing it normally on Snow Leopard would give me a success screen saying it had been installed but it seemed to have completely failed to install.

I then installed and used MacPorts and that was the start of the solution. If I were trying to install MySQL on Lion or Mountain Lion I think I would try MacPorts first.

You need Xcode in order to use MacPorts. If you don’t have it already you are going to have some challenges, because I couldn’t find it in the Apple developer center (they removed it since, I guess, you shouldn’t still be using Snow Leopard in their opinion). Luckily I already had Xcode installed.

To install MySQL using MacPorts:

[bash]sudo port install mysql5[/bash]

Then you will avoid a bunch of frustrating errors (though you may enjoy that process) if you realize you need the MySQL server also.

[bash]sudo port install mysql5-server[/bash]

Pay attention to the messages when you do that and you will see a suggestion to run

[bash]sudo -u _mysql mysql_install_db5[/bash]

See the instructions that are shown for lots more on how to deal with that failing, as it did for me. Or celebrate if it worked.

Go ahead and add mysql to your default path. Open your .bash_profile or .profile:

[bash]vim ~/.profile[/bash]
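
MacPorts installs its binaries under /opt/local, so the line to add is something like this (adjust if your MacPorts prefix is different):

[bash]# put the MacPorts binary directories at the front of the PATH
export PATH=/opt/local/bin:/opt/local/sbin:$PATH[/bash]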

Then I tried sudo /opt/local/share/mysql5/mysql/mysql.server start but that failed. At that point I found that MacPorts has its own way to start the server:

[bash]sudo port load mysql5[/bash]

Which worked.

To log in to MySQL you can use the following (root is created with no password on install):

[bash]mysql5 -uroot -p[/bash]

Go ahead and change the root password as the first thing you do.
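
One way to do that, assuming the MacPorts build names the admin client mysqladmin5 (matching the 5-suffixed names of the other binaries) and with new-password as a placeholder:

[bash]# set a password for the root account (replace new-password with your own)
mysqladmin5 -u root password 'new-password'[/bash]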

Related: Getting Ruby on Rails & mysql setup on a Mac OSX Leopard clean install – MacPorts doesn’t install org.macports.mysql5.plist with mysql5 +server – MySQL Performance Tuning Tips – MySQL cli Syntax

Set the Web Server to Use the Domain Without www

The main reason to bother with this is to help search rankings. Sub-domains (for example, www.curiouscat.com and curiouscat.com) are treated as separate websites even if you display exactly the same content for both. If 20% of the links to your site use the www and 80% don’t, then your site’s ranking with search engines is lower than it would be if the links were all treated as pointing to one site.

You can set the preferred domain in Google webmaster tools, but that doesn’t do anything for all the other search engines. Also, if you have both, some reporting tools will keep statistics separately for the non-www and www domains (Google Adsense does this, for example).

Using a virtual hosts file (sites-available)

For Apache you can place the following code in your virtual hosts file (in the sites-available directory under Apache).

Replace curiouscat.com with your domain name.

[bash]<Directory /srv/www/curiouscat.com/public_html/>
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.curiouscat.com$ [NC]
RewriteRule ^(.*)$ http://curiouscat.com/$1 [R=301,L]
</Directory>[/bash]

Then you need to apply the update and reload Apache.

Disable the site (in order to enable it with the updates)
[bash]sudo a2dissite curiouscat.com[/bash]
Enable the site
[bash]sudo a2ensite curiouscat.com[/bash]
Reload Apache (new way – Ubuntu 12.04)
[bash]sudo service apache2 reload[/bash]
The old way to reload Apache was sudo /etc/init.d/apache2 reload

If when you try to reload you get this error message:

“Invalid command ‘RewriteEngine’, perhaps misspelled or defined by a module not included in the server configuration”, then enable mod_rewrite in Apache:

[bash]sudo a2enmod rewrite[/bash]

You should then be told to restart Apache:

[bash]sudo /etc/init.d/apache2 restart[/bash]

Using .htaccess

Or you can include the following in your .htaccess file

[bash] RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.curiouscat.com$ [NC]
RewriteRule ^(.*)$ http://curiouscat.com/$1 [R=301,L][/bash]

Related: Checklist for Setting Up a New Domain on VPS – Phusion Passenger Tips and Troubleshooting Ideas

If you wanted to force www to be used instead just flip the regular expression around:

[bash]<Directory /srv/www/curiouscat.com/public_html/>
RewriteEngine On
RewriteCond %{HTTP_HOST} ^curiouscat.com$ [NC]
RewriteRule ^(.*)$ http://www.curiouscat.com/$1 [R=301,L]
</Directory>[/bash]

Customizing the Command Prompt for Terminal in Ubuntu

To customize the look of the command prompt for Terminal in Ubuntu you can insert code into .bashrc (if updates to .bashrc don’t seem to take effect, check whether the prompt is also set in .profile, which would override your .bashrc). Those files are found in your user directory, /home/[username].

[bash]echo $PS1[/bash]

will show you the current settings. You can make a change directly from the command line but it will only work for that session. For example:

[bash]PS1="\d\w $ "[/bash]

This is the code I have in my .bashrc related to the terminal prompt

[bash]# set variable identifying the chroot you work in (used in the prompt below)
if [ -z "$debian_chroot" ] && [ -r /etc/debian_chroot ]; then
    debian_chroot=$(cat /etc/debian_chroot)
fi

# set a fancy prompt (non-color, unless we know we "want" color)
case "$TERM" in
    xterm-color) color_prompt=yes;;
esac

if [ -n "$force_color_prompt" ]; then
    if [ -x /usr/bin/tput ] && tput setaf 1 >&/dev/null; then
        color_prompt=yes
    else
        color_prompt=
    fi
fi

if [ "$color_prompt" = yes ]; then
    PS1='${debian_chroot:+($debian_chroot)}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ '
else
    PS1='${debian_chroot:+($debian_chroot)}\u@\h:\w\$ '
fi
unset color_prompt force_color_prompt[/bash]
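
Changes to .bashrc only take effect in new shells; to apply them to the session you already have open you can source the file:

[bash]source ~/.bashrc[/bash]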

Testing Email Using The Ubuntu CLI

For a Virtual Private Server (VPS) or any other server you should have alerts for various things sent by email. For example, if you automate security updates for an Ubuntu web server you want to be notified if there is some issue with the automatic update.

In order to check and be sure email is set up and working on the server, there is a simple command line approach:

[bash]mail -s Test[/bash]

you then get prompts to fill in

To: [enter the email address and press return]

Cc: [press return]

then a blank line where you can type any text you want in the body of the message.

Then you send the message by pressing CTRL-d

[bash]CTRL-d[/bash]

If you don’t receive the email then you can troubleshoot what is going wrong.
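
A one-line, non-interactive way to do the same test (you@example.com is a placeholder address; mail here is the standard mailx-style client on Ubuntu):

[bash]# send a quick test message without the interactive prompts
echo "This is a test message from $(hostname)." | mail -s "Test" you@example.com[/bash]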

Keeping Your Hosted Ubuntu Web Server Software Up to Date

To set up automatic security updates, follow the instructions from Ubuntu.
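
The short version, as I understand it, is to install the unattended-upgrades package and enable the periodic runs; this is a sketch of the commands (check Ubuntu’s documentation for configuring exactly which updates are applied):

[bash]# install the package that applies security updates automatically
sudo apt-get install unattended-upgrades
# enable the periodic runs (writes /etc/apt/apt.conf.d/20auto-upgrades)
sudo dpkg-reconfigure --priority=low unattended-upgrades[/bash]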

To run a security update manually:

[bash]sudo unattended-upgrade[/bash]

A full update of all packages can be done using the following. Remember this may create some issues if one update makes something else you rely on no longer work properly. You should test to make sure things all work after the updates (for production systems you should obviously test before [first updating a staging server to make sure the updates don’t cause any problems] and after the updates).

First update the local package index (to find what needs to be upgraded).

[bash]sudo apt-get update[/bash]

Then upgrade the software.

[bash]sudo apt-get upgrade[/bash]

Adding a Key to Your Server for SSH Login

Using Ubuntu

An authentication key allows your server to verify that the computer you are using holds the right key and should be granted access. This lets you use the key instead of a username and password when using ssh.

If you don’t already have a key on your local machine (look for a file named id_rsa.pub in your home directory under the .ssh folder;

[bash]cd ~/.ssh[/bash]

will get you to the right directory), then you need to generate the key pair. On your desktop machine use:

[bash]ssh-keygen[/bash]

Next you copy the file to your server:

[bash]scp ~/.ssh/id_rsa.pub username@servername:.ssh/[/bash]

Then, on the server, rename the file (in the .ssh directory) to authorized_keys2
[bash]mv id_rsa.pub authorized_keys2[/bash]

The key is used to authenticate your computer, but on ssh login Ubuntu looks in the user’s folder. So if you also had a user2 account on the server and tried to ssh in as user2, you would not be authenticated, because ssh would look in /home/user2/.ssh for the authorized key file and not find it. You can put the same key in any user’s .ssh folder on your server to have that user also be automatically authenticated.
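
As an aside, on most systems there is a shortcut that copies the key and appends it to the server’s authorized_keys file for you (appending, rather than the rename above, also preserves any keys that are already authorized):

[bash]# copies your public key to the server and appends it to ~/.ssh/authorized_keys
ssh-copy-id username@servername[/bash]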

WordPress: Multiple Blog Network on One Server – Overcoming Conflicts

I ran into a problem when I added a second WordPress blog network to my server. I had the Curious Cat Blog Network up and running for quite some time with sub-domains for each individual blog in the network. WordPress automatically dealt with routing the sub-domains and making the urls work. It really is very nice how easy it is to create a new blog and have everything up and running: just add it in WordPress, no need to touch the server directly. Blog networks are a fairly new feature (added in WordPress 3.0, I think); I would imagine it builds on the earlier WordPress MU effort, but it is just part of regular WordPress now.

When I added the second blog network, however, the new faux-sub-domain that should have been used, affordable-funeral.moneyite.com, would instead be redirected to curiouscatnetwork.com, and since no such sub-domain existed on curiouscatnetwork.com it gave the standard error message WordPress generates when a sub-domain url is not recognized.

The main domain for the new site, moneyite.com, was working. I tried searching for solutions to this problem online but couldn’t find any. I am not sure whether multiple WordPress blog networks should work on the same server without any special configuration, but it wouldn’t for me. I found a solution that did work, so I will share what worked for me.

I created new sites-available records for each of the sub-domains, and once you reload Apache everything seems to work. I am not sure there isn’t some problem with doing things this way that I haven’t uncovered yet, but it is working for me, so I wanted to share this in case it can help anyone else trying to run multiple WordPress blog networks on one server.
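
A minimal sketch of such a sites-available record (the DocumentRoot here is a placeholder; point it at the document root of the second network’s WordPress install, then enable the site and reload Apache as described above):

[bash]<VirtualHost *:80>
    ServerName affordable-funeral.moneyite.com
    # placeholder path - use the WordPress install directory for the second network
    DocumentRoot /srv/www/moneyite.com/public_html
</VirtualHost>[/bash]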

Related: Checklist for moving an existing WordPress site to a new web host – WordPress error: Image could not be processed. Please go back and try again