Category Archives: php-mysql

Get list of cases in a PHP switch statement

Like the title says: you have a PHP script containing a (presumably long) switch statement, and you want to programmatically get a list of all the strings for each case.

My use case is, I have a Telegram bot for my own private use, which does several actions, all different among themselves, when receiving user input from a specific account (mine).

The command strings are predetermined, so I have a -long- list of cases like so:

switch(strtolower($text)) {
    case "blah1":
    case "blah2":
    case "blah3":
        echo file_get_contents($someurl);
        break;
    // [...]
    case "blah12":
        exec('php somescript.php > /tmp/somescript.log 2>&1 &');
        break;
    // [...]
}

I am adding new commands all the time, with the most diverse functions, and I might even forget some neat functions exist, so I wanted to implement a function where, in reply to a certain command, the bot lists all possible other commands (a --help function of sorts, if you will).

They cannot simply be duplicated into an array, to be neatly listed at my pleasure; that would be a useless exercise of my patience. Also, the listing functionality comes second: the first requirement is semantic appropriateness.

Let’s first say that there is no built-in function in PHP to get the list of cases in a switch statement, but you can still hack a function together yourself.

The following solution is very ugly, but it works on simple code, and I would use it only if both of the following conditions are met:

  1. You are the only user of the script (my Telegram bot takes into account commands only if they come from my account)
  2. The whole code in the script file basically revolves around the case switch and not much else

This scenario perfectly fits my case, so here’s what I did:

preg_match_all('/case \"([a-z0-9\s]+)\"\:/', file_get_contents(__FILE__), $matches);

You can then loop over the results:

foreach ($matches[1] as $casestring) {
    // [...]
}

or rather, as I actually did in the end, I simply returned the whole $matches[1] list.
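Putting the pieces together, here is a minimal self-contained sketch of the whole trick (the command names are placeholders, not my bot's real commands):

```php
// Toy dispatcher: the cases below stand in for the bot's real commands.
function dispatch($text) {
    switch (strtolower($text)) {
        case "hello":
        case "uptime":
            return "some action";
        case "help":
            // scrape this very file for the case strings and list them
            preg_match_all('/case \"([a-z0-9\s]+)\"\:/', file_get_contents(__FILE__), $matches);
            return implode(", ", $matches[1]);
    }
    return "unknown command";
}

echo dispatch("help"); // e.g. "hello, uptime, help"
```

The nice part is that adding a new case automatically updates the help listing, with no array to maintain.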


Remove qTranslate from WordPress and keep translations

This website goes BACK in time, hell if it does. It started maybe in 2001, with static HTML. Then I played for a while with SHTML, before jumping head on to PHP and writing my own, ugly CMS.

I already had Italian and English articles translated in here, and when I switched to WordPress (oh, joy and pain) I found this nifty plugin called qTranslate that kind of automated translation management.

Looked like a good idea back then, so I installed it and moved all bilingual articles in it.

Yeah, looked like a good idea back then.

After a while though, as WordPress updates progressed, I noticed how I couldn’t write new posts anymore because the custom editor changes broke… either that, or I had to manually add the language tags, or I had to keep back the version of WordPress not to lose editor function. NO BUENO!

Until qTranslate stopped working altogether: sorry guys, it’s not maintained anymore, f*ck you!

Luckily qTranslate-X was released, giving some more oxygen to my rare yet constant contribution to this blog.

Then, guess what, even qTranslate-X was discontinued.

Luckily qTranslate-XT came out, and it’s on GitHub, so as far as I can see it’s actively followed, developed, improved… still, it doesn’t cut it for me.

I mean, the developers are doing a GREAT job… can you imagine what a huge hassle it is following the development of a tool so complex, and coordinating the efforts of several people, while trying to keep the code working after major WordPress updates are released?

There must be a lot of people who thank anything that is sacred for qTranslate-XT’s existence.

I’m not one of those, since especially lately I’m either releasing articles in Italian, or in English, so there is not a lot of translating going on.

Every time I searched for methods to remove qTranslate, every strategy involved choosing a language to keep, and just trashing the others! As if I hadn’t invested a LOT of time translating! Why should I waste all this work?

I used to think the work to do that by myself was going to be immense, so I never tried, until today, when I achieved the objective, and am now happily composing, in the Gutenberg editor, on a qTranslate-less version of my blog, where every article has been kept, and the URL redirection of “” has been fixed and redirected to “”.

What’s the strategy? Well, I built the code for my own website (and I am NOT willing to customize it for yours, unless you offer to pay me), so these are the premises:

  1. Not every article is bilingual; on the contrary most are either in English or Italian
  2. I obviously want to maintain both languages
  3. I don’t care if the blogroll will show both Italian and English articles (some the translation of the other) in the same sequence
  4. I want to keep the existing database entry, and add another database entry for the additional language (if present); in this second case English keeps its original database row, and Italian is inserted as a new row
  5. It is made to work with bilingual sites, but in reality it will most definitely work fine with multilingual sites and you may not even need to edit anything; still, you are expected to be familiar with PHP to run it with confidence (BACKUP DATABASE FIRST!!!!11!!!1oneone)

Following is the code.


ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);


function doquery($query) {
	$result=mysqli_query($GLOBALS["dbcon"], $query); // reconstructed: the actual query call
	if ($report=mysqli_error($GLOBALS["dbcon"])) {
		die($report);
	}
	return $result;
}

function excerpt($string) {
	return substr($string, 0, 30)."...";
}

$GLOBALS["dbcon"]=@mysqli_connect($dbhost, $dbuser, $dbpass);
if (mysqli_error($GLOBALS["dbcon"])) $erroremysql=true;
mysqli_select_db($GLOBALS["dbcon"], $dbname); // assumption: the database selection was elided here
if (mysqli_error($GLOBALS["dbcon"])) $erroremysql=true;

$posts=doquery("SELECT * FROM wp_posts WHERE post_type='post'");


echo '<font face="Courier New">';

$lang=array("en","it"); // assumption: the site's languages
$main="en"; // assumption: the language that keeps the original row

while ($post=mysqli_fetch_assoc($posts)) {
	echo "<strong>post {$post["ID"]}</strong>:<br/>";
	// detect which qTranslate tag style this post uses
	if (strpos($post["post_title"],"[:en]")!==false || strpos($post["post_title"],"[:it]")!==false) {
		$s=array("\[:","\]"); // modern square-bracket tags
	}
	else if (strpos($post["post_title"],"<!--:en-->")!==false || strpos($post["post_title"],"<!--:it-->")!==false) {
		$s=array("<!--:","-->"); // legacy comment-style tags
	}
	$data=array();
	foreach ($lang as $l) {
		if (preg_match('/'.$s[0].$l.$s[1].'([\s\S]+?)'.$s[0].$s[1].'/',$post["post_title"],$matches)) {
			$data[$l][0]=$matches[1];
			// [...] post_content is extracted the same way into $data[$l][1]
		}
	}
	if (count($data)>1) {
		foreach ($data as $k=>$v) {
			echo "$k: ".excerpt($v[0])." - ".excerpt(strip_tags($v[1])).(($k==$main)?" main":"")."<br/>";
			//it is the main language, just update the post stripping the other language
			if ($k==$main) {
				doquery("UPDATE wp_posts SET post_title='".mysqli_real_escape_string($dbcon,$v[0])."', post_content='".mysqli_real_escape_string($dbcon,$v[1])."' WHERE ID=".$post["ID"]);
				echo "\n";
			}
			//it is not, so create a new post copying over the rest of the data
			else {
				doquery("
					INSERT INTO wp_posts (
					/* [...] column list elided */
					)
					VALUES (
					/* [...] values elided */
					)");
				echo "\n";
			}
		}
	}
	else  {
		echo "1: ".excerpt($data[key($data)][0])." - ".excerpt(strip_tags($data[key($data)][1]))." main"."<br/>";
		doquery("UPDATE wp_posts SET post_title='".mysqli_real_escape_string($dbcon,$data[key($data)][0])."', post_content='".mysqli_real_escape_string($dbcon,$data[key($data)][1])."' WHERE ID=".$post["ID"]);
		echo "\n";
	}
}
This is what you need to add to the .htaccess in the root of your public_html folder, obviously adapting it to your needs and adding more similar rows if you have additional languages:

RewriteRule ^it/([a-z0-9\-\_]+)/$ /it-$1/ [R=301,L]
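For example, assuming a hypothetical third language with prefix fr, the extra row would follow the same pattern:

```apache
RewriteRule ^fr/([a-z0-9\-\_]+)/$ /fr-$1/ [R=301,L]
```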

In my case it worked like a charm, though not without some cold sweats.

PHP script to batch download from Wallpaperscraft website

I found myself interested in renewing my wallpaper gallery, and the Wallpaperscraft website is really full of themed collections; like, it’s huge!

But then, who’s going to just download every picture by hand, right? Well I know PHP, so one morning when I had the time I jotted down some lines of code.

This script is working at the time of writing, but any change occurring in the source code or in the url structure may break it (not that it would be so hard to fix it anyway).

Usage is clear to anyone who can run a PHP script: just change the configurable values at the top and the folder name “Wallpapers” in the code, and it’s good to go.



// $webbase, $imgbase, $res, $i, $c and $goon come from the configurable block at the top (elided)
while ($goon) {
	echo "\ndownloading from $webbase$i\n\n";
	$html=@file_get_contents($webbase.$i); // reconstructed: fetch the catalog page
	if (strpos($html,"<html>")===false) {
		$html=@file_get_contents($webbase.$i); // reconstructed: retry once
		if (strpos($html,"<html>")===false) {
			echo "page $i broken, downloading it again...\n";
			continue;
		}
	}
	preg_match_all("/<a class=\"wallpapers__link\" href=\"([\/a-z0-9\_]+)\">/",$html,$matches);
	if ($matches[1][0]) {
		foreach ($matches[1] as $image) {
			if (!file_exists("Wallpapers/$image.jpg")) {
				$handle=@fopen($imgbase.$image.$res.".jpg", 'rb');
				if ($handle) {
					echo "$i:$c $image ...\n";
					file_put_contents("Wallpapers/$image.jpg", $handle); $c++; // reconstructed: save to disk
				}
			}
		}
		$i++;
	}
	else $goon=false; // reconstructed: no links found, last page reached
}

Reading a digitally signed xml p7m electronic invoice with PHP

If the electronic invoice file you need to open is digitally signed, that is, it has a second p7m extension, as you may have noticed its content is still more or less readable to the naked eye, because it is not obscured but simply “polluted” by the digital signature blocks.

It becomes unusable by any PHP parser though, and requires extracting the XML file from the signed p7m envelope.

Shopping list:

  • the openssl.exe program, downloadable from this page (you need the zip with the binary files)
  • the .xml.p7m file

The syntax of the openssl command is as follows:

openssl.exe smime -verify -noverify -in fattura.xml.p7m -inform DER -out fattura.xml

Then in your nice PHP code you will launch something like:

exec("path\\openssl.exe smime -verify -noverify -in $fatturap7m -inform DER -out path\\".basename($fatturap7m,".p7m").".xml",$output);
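A hedged variant of the same call with proper shell escaping (the function name is made up; the output path simply strips the trailing .p7m extension):

```php
// Build the openssl command that unwraps a .p7m envelope into plain XML.
function p7mExtractCommand($opensslBin, $p7mPath) {
	// the output .xml path is the input path without its trailing .p7m
	$xmlPath = substr($p7mPath, 0, -strlen(".p7m"));
	return escapeshellarg($opensslBin)
		." smime -verify -noverify -in ".escapeshellarg($p7mPath)
		." -inform DER -out ".escapeshellarg($xmlPath);
}

// exec(p7mExtractCommand("C:\\path\\openssl.exe", $fatturap7m), $output);
```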

You will then have the plain XML file available, with which you can follow what is illustrated in this article.

Dump SQLite database to .sql with PHP alone

I have a webapp on my phone that uses SQLite, and no sqlite3 binary I can call from the CLI, so I needed a pure-PHP solution for dumping my database.

Nothing on the internet seems to be created for this purpose, or at least nothing that can be found on the first page of Google.

So well, if this page won’t end up on the first page of Google, I’m just wasting my time.


So anyway:


$db = new SQLite3(dirname(__FILE__)."/your/db.sqlite");

$sql="";
$tables=$db->query("SELECT name FROM sqlite_master WHERE type ='table' AND name NOT LIKE 'sqlite_%';");

while ($table=$tables->fetchArray(SQLITE3_NUM)) {
	// the CREATE TABLE statement, straight from sqlite_master
	$sql.=$db->querySingle("SELECT sql FROM sqlite_master WHERE name = '{$table[0]}'").";\n\n";
	$rows=$db->query("SELECT * FROM {$table[0]}");
	$sql.="INSERT INTO {$table[0]} (";
	$columns=$db->query("PRAGMA table_info({$table[0]})");
	$fieldnames=array();
	while ($column=$columns->fetchArray(SQLITE3_ASSOC)) $fieldnames[]=$column["name"]; // reconstructed: collect column names
	$sql.=implode(",",$fieldnames).") VALUES";
	while ($row=$rows->fetchArray(SQLITE3_ASSOC)) {
		$fields=array();
		foreach ($row as $k=>$v) $fields[]="'".SQLite3::escapeString($v)."'"; // reconstructed: quote and escape each value
		$sql.="\n(".implode(",",$fields)."),";
	}
	$sql=rtrim($sql,",").";\n\n";
}

file_put_contents("dump.sql",$sql); // reconstructed: write the dump out

Or just find, edit, comment the code on GitHub.

Migrating PHP code from MySQL to SQLite, the basics

This is also for self-reference.

Data types


MySQL

There’s a shitload of data types, really: just go look at the official documentation and bask in the glory of MySQL redundancy (j/k).

SQLite

Very few, reassuring and simple data types: basically text, numbers, and “whatever” (blob). Again, go look at the official documentation.

Database connection


MySQL

$GLOBALS["dbcon"]=@mysqli_connect($dbhost, $dbuser, $dbpass);
if (mysqli_error($GLOBALS["dbcon"])) die("db connection error");
mysqli_select_db($GLOBALS["dbcon"], $dbname); // assumption: the database selection was elided here
if (mysqli_error($GLOBALS["dbcon"])) die("db connection error");


SQLite

$db = new SQLite3(dirname(__FILE__)."/DB/db.sqlite");
// WAL mode has better control over concurrency.
// Source:
$db->exec('PRAGMA journal_mode = wal;');
$db->exec('PRAGMA synchronous=NORMAL;');

(the last couple of rows are only useful if you plan to have some -little- write concurrency; otherwise don’t use them)

Very important thing to know: if you are writing code for a locally-running application, SQLite connections will not “time out”, as there’s no server waiting for your input, just a file on disk (or even in memory!)






Run queries

With MySQL, mysqli_query() does everything; SQLite3 instead offers three methods.

$db->exec()

when you don’t expect results, so for INSERT, UPDATE or DELETE;

$db->query()

when you expect multiple results, or several fields in a row;

$db->querySingle()

when you want a single-value result returned, for example when the query is something like SELECT timestamp FROM records WHERE id=$number LIMIT 1 (for this, with MySQL, you would have to parse the results with mysqli_fetch_array or similar, and then select the first value with [0]).
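A minimal sketch of that last case, using an in-memory database (the table and values are made up):

```php
// In-memory SQLite database with one sample row.
$db = new SQLite3(":memory:");
$db->exec("CREATE TABLE records (id INTEGER PRIMARY KEY, timestamp TEXT)");
$db->exec("INSERT INTO records (id, timestamp) VALUES (1, '2020-01-01 12:00:00')");

$number = 1;
// one call, one scalar back: no fetch step, no [0] indexing needed
$timestamp = $db->querySingle("SELECT timestamp FROM records WHERE id=$number LIMIT 1");
echo $timestamp; // 2020-01-01 12:00:00
```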

Fetch results



In SQLite3, fetchArray() takes a mode constant:

SQLITE3_BOTH

when you want both associative and indexed arrays,

SQLITE3_ASSOC

when you only need associative arrays, and

SQLITE3_NUM

if you want only indexed arrays.


In parallel with the above, MySQL has mysqli_fetch_array() (both), mysqli_fetch_assoc() (associative only) and mysqli_fetch_row() (indexed only).
Speed considerations

If you don’t need associative arrays, you should always go for indexed arrays, since in both MySQL and SQLite they are fetched significantly faster; also, even if by very little, fetching only associative arrays is still faster than having both associative and indexed fetched together (and you’re not going to need both anyway).






Escape strings

MySQL has mysqli_real_escape_string(), whose SQLite3 counterpart is SQLite3::escapeString(); this function is not binary safe though at the time of writing (and hasn’t been for a while, from what I understand…)

Database functions

Just converting the PHP functions won’t be sufficient in most cases.

Think about time functions for example, or DEFAULT values, or NULLing a NOT NULL timestamp column to have it automatically assigned CURRENT_TIMESTAMP; these things are not present in SQLite.



MySQL

NOW(), YEAR(), TIMEDIFF(), and another shitload of functions.

SQLite

DEFAULT (Datetime('now','localtime'))

Several variations on the strftime() function, of which the Datetime() above is an example.

Again, go look at the official documentation, as what you see above is my own translation for my own purposes, and you will find both in official sources and in the vast world of StackOverflow a plethora of readings and interpretations.
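To make the mapping above concrete, a small sketch of the SQLite side (the MySQL equivalents are in comments; the sample date is made up):

```php
$db = new SQLite3(":memory:");

// MySQL: SELECT NOW()
echo $db->querySingle("SELECT datetime('now','localtime')")."\n";

// MySQL: SELECT YEAR('2020-05-06')
echo $db->querySingle("SELECT strftime('%Y','2020-05-06')")."\n"; // 2020
```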

Share MySQL and Apache folders from Windows XAMPP to Linux XAMPP

This is a replica of the article by jultech from 2007 for posterity and data-replication purposes.

When dual-booting between Windows (the first OS installed) and Linux, do this:

~ # mv /opt/lampp/var/mysql /opt/lampp/var/mysql.BACKUP
~ # ln -s /mnt/windisk/path/to/XAMPP/mysql/data/ /opt/lampp/var/mysql
~ # mv /opt/lampp/htdocs /opt/lampp/htdocs.BACKUP
~ # ln -s /mnt/windisk/path/to/XAMPP/htdocs /opt/lampp/htdocs

This creates a backup of the original XAMPP folders in Linux, then symlinks them to the Windows disk mounts.

Restart XAMPP to have the configuration running.

Thank you jultech.

Add custom social media share icon buttons in phpBB

Self-reference article: how to add responsive social media share icons on your forum, for each page.

HTML to add to /styles/stylename/template/over_header.html, right under <div id="page-header" class="page-width">

 <div id="socialshare" data-expendable="yes">
 <a class="twittersharebutton" href="{SOCIAL_URL}" target="_blank">t</a>
 <a class="facebooksharebutton" href="{SOCIAL_URL}" target="_blank">f</a>
 <a class="googlesharebutton" href="{SOCIAL_URL}" target="_blank">g</a>
 </div>

CSS to add at the bottom of <head> section of same file:

 #socialshare { /* [...] */ }
 a.twittersharebutton, a.facebooksharebutton, a.googlesharebutton {
 text-decoration: none;
 /* [...] */
 }
 a.twittersharebutton { /* [...] */ }
 a.facebooksharebutton { /* [...] */ }
 a.googlesharebutton {
 line-height:60% !important;
 }
 #socialshare a:hover { /* [...] */ }
 @media (min-width: 1100px) {
 #socialshare { /* [...] */ }
 #socialshare>div {
 width: 50px;
 }
 a.twittersharebutton, a.facebooksharebutton, a.googlesharebutton { /* [...] */ }
 #page-header {position:relative;}
 }
 @media (max-width:1099px) {
 #socialshare { /* [...] */ }
 a.twittersharebutton, a.facebooksharebutton, a.googlesharebutton { /* [...] */ }
 }

Code to add in /includes/functions.php, in the template variable assignment (look for LAST_VISIT_DATE)

'SOCIAL_URL' => urlencode(generate_board_url() . '/' . $user->page['page']),

Reload, all done!

Cache PHP to gzipped static html pages using htaccess redirect

No matter the server performance, the fastest kind of website for a visitor is the one with static HTML pages; this way the server just has to send existing data to the browser instead of starting the PHP interpreter, opening a connection to MySQL, fetching data, formatting the page and only after that sending it. Less CPU, less memory, less processing time, more users served in a smaller amount of time.

Avoiding PHP and MySQL execution altogether is the key, and my project was to create something similar to WPSuperCache to be implemented in a generic PHP website, so my thanks go to said plugin’s developers for the source code that gave me precious tips.

DISCLAIMER: this guide presumes you are “fluent” with PHP and .htaccess coding, and is meant to just give directions as of how to obtain a certain result. This guide will not give a pre-cooked solution to just copy/paste into your website, you need to change/add code to adapt it to your need; I am not the person who can help you if you need a tailored solution, simply because I have no time to give away free personalized help!

Here’s our logical scheme:

  1. Visitor comes to website and asks for page X;
  2. Webserver checks via .htaccess (avoiding PHP execution) if there is a static cached version of page X already, and in this case either serves the gzipped file (if the client supports it), or falls back to the uncompressed HTML version… at which point our scheme ends; otherwise, if no cached version exists, continue to next step;
  3. PHP builds the page and serves it to the visitor as “fresh”, but at the same time saves the HTML output as both gzipped and uncompressed version so the next visitor will directly download that one.

This is a very bare procedure, which needs to be perfected by the following conditions:

  • Not all pages need to be cached (those that by their very nature change very often, like real time statistics, a poll, whatever)… in fact, some pages must NOT be cached (those that only moderators can view, both for security reasons and consistency); PHP must avoid caching those pages, so that .htaccess will always serve the “fresh” PHP page.
  • Pages that do get cached still need to be refreshed, because sometimes they can change (articles or posts that are modified/updated with time, or where comments get posted).
  • Some pages, even if the corresponding cached version exists, must NOT be served from cache, for instance when POST data is submitted (sometimes also with GET; you as the webmaster should know if this is the case), which requires PHP to handle them.

Define “caching behaviour” in PHP

This section is where you set the variables that the PHP caching routine will check against to know when and how to do its job.
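A sketch of what those variables can look like (the exact values, apart from the "statistics" example, are assumptions; the -static suffix matches the .htaccess rules further down):

```php
$cachepath = dirname(__FILE__)."/cache/";  // where the static .html and .html.gz files go
$cachesuffix = "-static";                  // appended to the page name in the cachefile name
$nocachelist = array(
	"statistics" => 1,                 // pages that must never be cached
);
```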


And explained:

  • $cachepath is the folder where you want the static (HTML and gzip) files to be stored (the website I use this caching routine on, has all the pages accessible from root folder with rewrite URLs, in other words the only slash in the URL is the one after the domain name, and I have no need to reproduce any folder structure; if you do, you’re on your own);
  • $cachesuffix is a string appended to the URL string when naming the cachefile (so a page named pagename gets cached as pagename-static); this is useful if you want to cache the homepage, which is not named index.php but is just the bare domain, because in that case the cachefile name would otherwise be empty and .htaccess won’t find it;
  • $nocachelist is an associative array where you have to add as many keys (each pointing to a value of 1) as the pages you don’t want (for whatever reason) to cache; in the key name you put the string the user would write in the URL bar after the domain name slash to get to the page, for example if you don’t want to cache your statistics page you would be using “statistics”=>1 in there, as the example already does.

Have PHP actually save to disk the cached pages

     if (
        !isset($nocachelist[$_GET["page"]]) &&
        !$_SESSION["admin"] &&
        !count($_POST) // && [...] one more condition (no query string appended) was elided here
    ) {
        //build the uncompressed cache
        if (!file_exists($cachepath.$url.$cachesuffix.".html")) {
            file_put_contents($cachepath.$url.$cachesuffix.".html", $html); // reconstructed: write the page
        }
        //build the compressed cache
        if (!file_exists($cachepath.$url.$cachesuffix.".html.gz")) {
            file_put_contents($cachepath.$url.$cachesuffix.".html.gz", gzencode($html, 9)); // reconstructed
        }
    }

Pretty much self explanatory, isn’t it.
No seriously, do you really want me to elaborate?

Ok, you got it.

  • The $_GET[“page”] is a GET value I set early in the code to know where we are in the website; you can use any variable here as long as you can check it against the $nocachelist array;
  • The other conditions in the first if should be clear: they avoid building the page’s cache if the CMS’s admin is logged in (security), or if POST data is submitted, or if there is a query string appended to the URL (consistency/stability);
  • $url is a variable that I define early in the code, and contains the string after the domain name slash and before the query string question mark, basically the kind of string you fill the $nocachelist array with (if you were paying attention, you may now think I have a redundant variable since $url and $_GET[“page”] should be the same, but this is not the case for other reasons);
  • $html is the string variable that, across the whole CMS, holds the raw HTML code to echo at the end of the PHP execution; you can either do like I do and define such a string, or use an output buffer to obtain the HTML if you instead print it directly to screen during PHP execution;
  • ~creationtime is a “hotkey” I use in my template to plug in the time in seconds that was needed to create the page in PHP; since I am creating a cached version now, the creation time of the page is zero, because it’s already there to be downloaded by the browser instead of having to be compiled by the server, so in there I print either “cached&gzipped” for clients that support gzip, or only “cached” when the browser doesn’t; you can safely strip out this part, as this is more of an eyecandy/nerdy/debug thing.

Let .htaccess send the cached files before starting the PHP compiler

AddEncoding x-gzip .gz
<FilesMatch "\.html\.gz$">
    ForceType text/html
</FilesMatch>

RewriteCond %{REQUEST_URI} !/forum/
RewriteCond %{QUERY_STRING} !.*=.*
RewriteCond %{HTTP:Cookie} !^.*(isadmin|nocache).*$
RewriteCond %{HTTP:X-Wap-Profile} !^[a-z0-9\"]+ [NC]
RewriteCond %{HTTP:Profile} !^[a-z0-9\"]+ [NC]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond /full/path/to/your/htdocs/cache/$1-static.html.gz -f
RewriteRule ^(.*) /cache/$1-static.html.gz [L,T=text/html]

RewriteCond %{REQUEST_URI} !/forum/
RewriteCond %{QUERY_STRING} !.*=.*
RewriteCond %{HTTP:Cookie} !^.*(isadmin|nocache).*$
RewriteCond %{HTTP:X-Wap-Profile} !^[a-z0-9\"]+ [NC]
RewriteCond %{HTTP:Profile} !^[a-z0-9\"]+ [NC]
RewriteCond /full/path/to/your/htdocs/cache/$1-static.html -f
RewriteRule ^(.*) /cache/$1-static.html [L]

This is the code you need to plug in .htaccess, preferably after everything else, but before defining the custom error pages; anyway, since you should be .htaccess-fluent, you shouldn’t need to be told where this fits best.

Some detailing for the curious:

  • First bit is needed (at least I needed it) to serve the gzipped files in a way that the browser knows to handle, otherwise I just got gibberish (the gzip was being sent to output without being uncompressed by the browser first);
  • if the cookies “isadmin” or  “nocache” exist, the cache version of the page will not be served even if it exists; easy explanation, if an admin is logged, and there is special content on a page that only admins can see, you don’t want them to see the “vanilla” cached version of the pages instead; so it’s your duty in this case to set a “isadmin” cookie when an admin logs in, and remove it when the admin logs out;
  • choosing the correct full path on the webserver was a bit tricky in my case, I can’t quite remember where the issue was, but depending on the method I used to get it I had different paths; I kind of remember I had to choose between $_SERVER[“DOCUMENT_ROOT”] and dirname(__FILE__) because only one was working with .htaccess;
  • you don’t really have to change much in this code snippet unless you have particular needs; the /forum/ path exclusion could be irrelevant in your case;
  • thanks go to WPSuperCache developers for their .htaccess code that I stole to build this snippet!

Last but not least

The cache is there to help you, not to handicap your website. You decide what the cache hit rate should be, and how many times a cached copy should get served instead of the dynamic page before considering it paid off, but you have to clear the cache from time to time to make sure you’re not serving outdated stuff to your visitors.

In my case I use an external cronjob provider to trigger a PHP routine every night which includes the following:

$handle=opendir("cache"); // reconstructed: open the cache folder first
while (($file = readdir($handle))!==false) @unlink("cache/".$file);
closedir($handle);

so that every day the website starts off with a fresh cache. No less important, you should clear the cache of a single page as soon as you know that page changed, unless you’re ok with the changes becoming visible only the next day, after all the cache is cleared anyway. Example: you edit a page while logged in as admin, or a user posts a comment, or anything happens that you have control over that alters in any way the page’s HTML: simply use unlink() to delete both the gzipped and uncompressed caches, and the website will recreate them with the updated content.
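For the single-page case, the cleanup boils down to something like this (the function name is made up; the variables follow the ones defined earlier):

```php
// Delete both the uncompressed and the gzipped cached copies of one page, if they exist.
function clear_page_cache($cachepath, $url, $cachesuffix = "-static") {
	@unlink($cachepath.$url.$cachesuffix.".html");
	@unlink($cachepath.$url.$cachesuffix.".html.gz");
}
```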


Have fun pimping up this draft!

Zip MySQL database backup with PHP and send as email attachment

I was a happy user of MySQLDumper until a while ago, when my shared hosting upgraded to a cloud architecture and the latest PHP version; with that, some PERL modules went missing and I was left without a viable backup solution (MySQLDumper’s PHP approach uses a javascript reloading routine which is super slow and compatible with neither wget nor remote cronjobs).

So I found a couple of tutorials on the net (this, and this), mixed them together, especially improving David Walsh’s routine, blended them with some healthy bzip2 compression, and shipped to my webspace.

I stripped the part where you can select which tables to back up; I figured that any backup deserving this name should be “all-inclusive”; also, I think that anyone with the special need to back up only selected tables should be smart enough to change the code to suit his/her needs.

UPDATE 9/9/11: I just added a snippet of code to blacklist some tables from the backup, since I needed to trim the size or it gave a fatal error about the available memory being full; just fill the array with rows as you see fit

This is the result:

$creationstart=strtok(microtime()," ")+strtok(" ");

// mail settings ($from_mail, $replyto and $to are assumptions, fill in your own)
$subject="Backup DB";
$from_name="Your trustworthy website";
$from_mail="backup@yourdomain.com";
$replyto="you@yourdomain.com";
$to="you@yourdomain.com";

// tables to exclude from the backup, as per the 9/9/11 update (name is an example)
$tablesblocklist=array("some_huge_table"=>1);

mysql_connect($dbhost, $dbuser, $dbpass);
mysql_select_db($dbname); // assumption: the database selection was elided here

$return="";
$tables = array();
$result = mysql_query("SHOW TABLES");
while($row = mysql_fetch_row($result))
	$tables[] = $row[0];
foreach($tables as $table) {
	if (!isset($tablesblocklist[$table])) {
		$result = mysql_query("SELECT * FROM $table");
		$return.= "DROP TABLE IF EXISTS $table;";
		$row2 = mysql_fetch_row(mysql_query("SHOW CREATE TABLE $table"));
		$return.= "\n\n".$row2[1].";\n\n";
		while($row = mysql_fetch_row($result)) {
			$return.= "INSERT INTO $table VALUES(";
			$fields=array();
			foreach ($row as $field)
				$fields[]="'".mysql_real_escape_string($field)."'";
			$return.= implode(",",$fields).");\n";
		}
	}
}

$filename='db-backup-'.date("Y-m-d H.i.s").'.sql.bz2';
$uid=md5(uniqid(time())); // MIME boundary

$header="From: ".$from_name." <".$from_mail.">\r\n".
"Reply-To: ".$replyto."\r\n".
"MIME-Version: 1.0\r\n".
"Content-Type: multipart/mixed; boundary=\"".$uid."\"\r\n\r\n".
"This is a multi-part message in MIME format.\r\n".
"--".$uid."\r\n".
"Content-type:text/plain; charset=iso-8859-1\r\n".
"Content-Transfer-Encoding: 7bit\r\n\r\n".
"Database backup attached.\r\n\r\n".
"--".$uid."\r\n".
"Content-Type: application/octet-stream; name=\"".$filename."\"\r\n".
"Content-Transfer-Encoding: base64\r\n".
"Content-Disposition: attachment; filename=\"".$filename."\"\r\n\r\n".
chunk_split(base64_encode(bzcompress($return)))."\r\n\r\n".
"--".$uid."--";
mail($to, $subject, "", $header); // reconstructed: send the backup mail

$creationend=strtok(microtime()," ")+strtok(" ");
$creationtime=round($creationend-$creationstart,2); // reconstructed: compute elapsed time
echo "Database backup compressed to bz2 and sent by email in $creationtime seconds";

(WordPress killed the indenting and I’ve no intention of fixing it manually)

The whole routine runs in roughly 7 seconds on my not-so-bright hoster server, including bzip2’ing and mailing, while the original text-only output of David’s code took roughly 17 seconds on the same server; this is probably due to the removal of several redundant loops, and to using the built-in mysql_real_escape_string() and implode() functions instead of David’s workarounds.

My sincere thanks go to the authors of the two guides, without whom I wouldn’t be writing this today 😉