Some years ago, a fellow tech director asked me why I would go to the trouble of writing custom scripts if they would all periodically have to be updated to keep up with changing technologies. At the time, I replied that the upgrade paths for commercial web programs were typically not painless, and that I thought the upgrades would not take very long compared to the time invested to write the scripts in the first place. However, I never had the chance to put this theory to the test until today. Fortunately, I have been proven right so far.
Last week, I installed an updated stack: IIS6, PHP5, mySQL5, and PERL 5.8.8. Today, it has taken only half a day to restore functionality to about three quarters of the insideUHS web scripts. I have learned a few lessons along the way.
IIS6 has added Web Service Extensions, which must be enabled for each script extension you want to be able to run (e.g., .cgi, .php). This seems obvious in hindsight, but I somehow managed to lose about half a day getting scripts to run on the new server. I was thrown off by the “file not found” error messages that IIS6 returns for the sake of security through obscurity.
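For the record, the GUI route is IIS Manager → Web Service Extensions → “Add a new Web service extension,” pointing the extension at the interpreter (perl.exe, php-cgi.exe, and so on). I believe the same thing can be scripted with the iisext.vbs admin script that ships in system32, but the arguments below are my best recollection, so verify them against the IIS6 documentation before relying on them:

```
c:\> cscript %SystemRoot%\system32\iisext.vbs /AddFile C:\Perl\bin\perl.exe 1 PERLCGI 1 "Perl CGI"
```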
D’Arcy Norman provided a quick rundown on upgrading mySQL that I used to move from mySQL4 on the old server to mySQL5 on the new. My transition was a bit tougher than his. I learned that it is important to use the -Q option he specified, since some of my column names conflicted with mySQL reserved words. I also had to run the mysql_fix_privilege_tables script to upgrade the mysql user database to the new version. I somehow barely missed out on a slightly newer version of mySQL that includes a mysql_upgrade script in the distribution.
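For reference, -Q is mysqldump's --quote-names switch; a dump command along these lines (database name hypothetical) wraps every table and column name in backticks:

```
c:\> mysqldump -Q -u root -p olddb > olddb.sql
```

With -Q, a column unluckily named order comes out as `order` in the dump, so the mySQL5 parser treats it as an identifier rather than choking on the reserved word.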
I was also hung up for a while on mySQL import errors while moving data from the old server to the new. It seems that some students had input dates of 2005-02-30 and 2006-06-31 into date fields, which mySQL5 is less forgiving about. I fixed all of these, but then I ran into an “out of range” error that was harder to troubleshoot. Having a hard time fixing all the bad data in our rather large database, I searched the web for a workaround and found that I could temporarily disable strict mode in my.ini for the import. I then restored strict mode and everything worked fine. I suppose the bad data may produce an error again one day, but by then it should be a lot easier to localize and fix within the context of a working web app. Lots of last year’s data won’t ever be viewed again, and if the problem is a poorly-written open-source script, then a new version should do the trick.
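The workaround amounts to a one-line change in my.ini (back up the original value first; the section header is from a typical Windows mySQL5 install). Clearing sql-mode turns off strict mode server-wide until the original line is restored:

```ini
[mysqld]
# Temporarily disable strict mode so legacy rows like '2005-02-30'
# import without errors; restore the original sql-mode afterwards.
sql-mode=""
```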
I am still getting used to the new security feature of IIS6 which suppresses PERL compilation errors because they do not include a full content-type header. I still want to search ActiveState for a way to temporarily enable headers on error messages when debugging scripts. However, I have found a couple of other ways to debug these scripts. First off, you can see the compilation errors by running a script from the command line, e.g. ‘c:\> perl test.cgi’. If you need to pass parameters to the script to produce the error, the format is different from a URL query string, e.g. ‘c:\> perl test.cgi action=submit’. This is not going to scale up to a full form, but it resolves most script issues.
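One other option worth noting, though I have not wired it into these scripts yet: Perl’s bundled CGI::Carp module can redirect fatal errors to the browser wrapped in a valid header, which sidesteps the suppressed-error problem while debugging. A minimal sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;
# CGI::Carp ships with Perl. fatalsToBrowser wraps die() messages
# (and many compile-time errors occurring after this line) in a
# minimal HTML page with a proper Content-type header, so IIS6
# passes them through instead of suppressing them.
use CGI::Carp qw(fatalsToBrowser);

print "Content-type: text/html\n\n";
# ... rest of the script; errors now show up in the browser.
```

Obviously that line should come back out before the script goes into production, since it can leak internals to users.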
Most PERL script issues I have found are related to libraries I used on the old server that I must reinstall on the new server. The command-line error messages are very informative in this case. I love the ActiveState Perl Package Manager (PPM). Just run ‘ppm’ at the command line, search for the module you want, and then install it.
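For example, a session to find and install the mySQL driver looks roughly like this (prompts and output vary by PPM version, so treat this as a sketch):

```
c:\> ppm
ppm> search DBD-mysql
ppm> install DBD-mysql
ppm> quit
```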
It’s a good idea to search first before installing. I was expecting to use the colon format DBD::mysql but discovered that ppm uses dashes instead, e.g., DBD-mysql. It’s a really fast way to automatically find and install PERL modules into your Windows PERL distribution.
A few scripts have required some easy path edits, since I altered the web directory structure a bit to better organize static and dynamic content. That change was originally motivated by the need to store PHP scripts separately from HTML docs, to keep database passwords out of directories to which users have read access. I have been good about naming paths to essential data files and templates at the top of each script, so those fixes have been quick.
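As a sketch, the top of each script looks something like this (the names and paths here are made up for illustration):

```perl
# Site-specific locations declared once at the top of the script,
# so a directory reshuffle only means editing these lines.
my $DATA_DIR     = 'D:/web/data/insideuhs';      # hypothetical path
my $TEMPLATE_DIR = 'D:/web/templates/insideuhs'; # hypothetical path
my $ROSTER_FILE  = "$DATA_DIR/roster.txt";
```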
This reminds me to mention that Windows has still not satisfactorily addressed FTP security. Instead of letting me set FTP permissions by user account, my only option is to restrict users to their home directories for FTP, and even that is not trivial to set up. That’s why I had to remove the PHP scripts from the web directories in the first place: otherwise, users would be able to download PHP source code and look for juicy database passwords contained therein. FTP still relies solely on directory permissions, which are no help when you want to give users HTTP read access but no FTP access of any sort. Given the rise of web-based CMS systems, we may eliminate FTP entirely this year. The number of FTP users has declined in recent years anyway as Moodle and blogs have taken off.
Onward with the web migration.