Few people know about my ability to transform my feet into roller blades. Doctors around the world are perplexed, and also impressed. Biologists debate whether such a transformation is even possible.
I casually slide into the biologist debating hall and take a seat near the front. People are staring at my roller-feet, but I don’t care. I put my shades on.
There are two main things that tripped me up while I was writing functional tests for my Laravel controllers: POST requests and session state.
Laravel’s Controller class has the call() method, which essentially makes a GET request to a controller method. In order to make POST requests, it’s necessary to inject some extra parameters into the HttpFoundation components. To make this easier, I created a ControllerTestCase class with convenient post() methods:
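The class itself is behind the cut; as a rough, hypothetical sketch of the idea (the HttpFoundation wiring here is my assumption, not the author’s actual code):

```php
use Symfony\Component\HttpFoundation\Request;

// Hypothetical sketch, not the post's actual class.
class ControllerTestCase extends PHPUnit_Framework_TestCase
{
    // Dispatch a controller action as a POST request with the given data.
    public function post($uri, array $params = array())
    {
        // HttpFoundation can build a request with an explicit method and body.
        $request = Request::create($uri, 'POST', $params);

        // Swap this request into the framework before calling the controller;
        // the exact hook depends on the Laravel version in use.
        // ...
    }
}
```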
Bcrypt is a Blowfish-based hashing algorithm which is commonly used for password hashing because of its potentially expensive key setup phase. A Bcrypt hash has the following structure:
$2a$(2 chars work)$(22 chars salt)(31 chars hash)
The reason that the key setup phase can be potentially expensive is that it is run 2^work times. As password hashing is usually associated with common tasks like logging a user into a system, it’s important to find the right balance between security and performance. Using a high work factor makes it incredibly difficult to execute a brute-force attack, but can put unnecessary load on the system.
Using Marco Arment’s PHP Bcrypt class, I performed some benchmarks to determine how long it takes to hash a string with various work factors:
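The benchmark itself is cut off above, but the idea can be sketched in plain PHP using the built-in crypt() function instead of the Bcrypt class (that substitution is my assumption, not the post’s code). Each increment of the work factor should roughly double the hashing time:

```php
// Time a single Bcrypt hash at each work factor.
// crypt() expects a "$2a$" prefix, a two-digit cost, and a
// 22-character salt drawn from ./A-Za-z0-9.
$salt = 'abcdefghijklmnopqrstuv';
foreach (range(8, 14) as $work) {
    $start = microtime(true);
    crypt('secret password', sprintf('$2a$%02d$%s', $work, $salt));
    printf("work=%d: %.3fs\n", $work, microtime(true) - $start);
}
```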
It’s really easy to set up automatic MySQL backups using mysqldump. First, you need to set up a user with SELECT and LOCK TABLES privileges. In this example the user doesn’t have a password.
CREATE USER 'autobackup'@'localhost';
GRANT SELECT, LOCK TABLES ON *.* TO 'autobackup'@'localhost';
Next, create the cron job with crontab -e. This job is set to run every day at 5:20 am.
20 5 * * * mysqldump --user=autobackup dbname | gzip -c > /var/backups/dbname-`/bin/date +\%Y\%m\%d`.sql.gz
Don’t forget to change dbname to the name of the database that you want to back up. And that’s it – you’re done! This cron job will create a backup of your database and save it to /var/backups with a filename based on the current date, e.g. dbname-20130101.sql.gz.
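It is also worth checking that the dumps actually restore. The snippet below reconstructs today’s filename (matching the date format in the cron line) and shows how to pipe it back into mysql; the target database name is illustrative:

```shell
# Today's backup, as named by the cron job above.
backup="/var/backups/dbname-$(date +%Y%m%d).sql.gz"
echo "$backup"

# To restore, decompress to stdout and feed the SQL into mysql:
#   gunzip -c "$backup" | mysql --user=root dbname
```

Note that the autobackup user above only has read privileges, so restore as a user that is allowed to write.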
MySQL has a prefix limitation of 767 bytes in InnoDB, and 1000 bytes in MyISAM. This has never been a problem for me, until I started using UTF-16 as the character set for one of my databases. UTF-16 can use up to 4 bytes per character, which means that in an InnoDB table you can’t have any keys longer than 191 characters. Take this CREATE statement for example:
CREATE TABLE `user` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`username` varchar(32) NOT NULL,
`password` varchar(64) NOT NULL,
`email` varchar(255) NOT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `UNIQ_8D93D649F85E0677` (`username`),
UNIQUE KEY `UNIQ_8D93D649E7927C74` (`email`)
) ENGINE=InnoDB DEFAULT CHARSET=utf16 AUTO_INCREMENT=1 ;
This will fail with an error like Specified key was too long; max key length is 767 bytes, because the UNIQUE KEY on the email field requires at least 1020 bytes (255 * 4).
Unfortunately there is no real solution to this. Your only options are to either reduce the size of the column, use a different character set (like UTF-8), or use a different engine (like MyISAM). In this case I switched the character set to UTF-8 which raised the maximum key length to 255 characters.
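Applying that change, the working statement differs from the one above only in its character set; with UTF-8 (at most 3 bytes per character in MySQL’s utf8), the email key needs at most 765 bytes, which fits under the 767-byte limit:

```sql
CREATE TABLE `user` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `username` varchar(32) NOT NULL,
  `password` varchar(64) NOT NULL,
  `email` varchar(255) NOT NULL,
  PRIMARY KEY (`id`),
  UNIQUE KEY `UNIQ_8D93D649F85E0677` (`username`),
  UNIQUE KEY `UNIQ_8D93D649E7927C74` (`email`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;
```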
There are often times when you want to modify a file but not commit the changes, for example changing the database configuration to run on your local machine.
Adding the file to .gitignore doesn’t work, because the file is already tracked. Luckily, Git will allow you to manually “ignore” changes to a file or directory:
git update-index --assume-unchanged <file>
And if you want to start tracking changes again, you can undo the previous command using:
git update-index --no-assume-unchanged <file>
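One caveat: it is easy to lose track of which files you have flagged this way. You can list them with git ls-files -v, where assume-unchanged entries are tagged with a lowercase letter (normally tracked files show an uppercase H):

```shell
# List files currently marked assume-unchanged.
# git ls-files -v prefixes each path with a status letter;
# a lowercase letter (e.g. "h") means assume-unchanged is set.
git ls-files -v | grep '^[a-z]'
```

If nothing is printed, no files are currently flagged.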
Today I found out just how easy it is to convert an SVN repository to Git without losing any commit history. Note that you will need git-svn (apt-get install git-svn on Debian/Ubuntu).
git svn clone http://mysvnrepo.com/my-project my-project
git remote add origin firstname.lastname@example.org:/my-project.git
git push origin master
Et voilà, my-project.git has the full commit history of the my-project SVN repository.
If anybody knows whether SVN branches can be converted to Git branches, please get in touch!
Unlike Doctrine 1 with its NestedSet behaviour, there is no nested set functionality in the core of Doctrine 2. There are a few extensions available that offer nested set support:
I tried all of these extensions, but none of them felt simple or lightweight enough for my application. What I wanted was a Category entity which could have a tree of sub-categories, e.g.:
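The example itself is behind the cut; a minimal self-referencing mapping in Doctrine 2 annotations might look like the following sketch (the field and association names here are my assumption, not necessarily the author’s):

```php
/** @Entity */
class Category
{
    /** @Id @Column(type="integer") @GeneratedValue */
    private $id;

    /** @Column(type="string") */
    private $name;

    /** @ManyToOne(targetEntity="Category", inversedBy="children") */
    private $parent;

    /** @OneToMany(targetEntity="Category", mappedBy="parent") */
    private $children;
}
```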
Everybody wants to write “good code”, right? So why is it that nearly every time we pick up another developer’s work, our WTF-o-meter goes crazy?
Everybody has a different idea of what “good code” is. Below are a few ways that I believe we can increase the quality of our code and reduce the number of WTFs our code generates.
- Keep it simple; refactor overly-complex methods…
- …Or if refactoring isn’t feasible, document complex methods.
- Use descriptive variable and method names.
- Follow code conventions.
- Don’t commit unfinished or broken code.
Most of these are just common sense. The trouble is, we throw good coding practices – and common sense – out the window when we’re under pressure from things like slipping deadlines and scope creep. If you ever find this happening, just remember to write your code as if the person who has to maintain it is a violent psychopath who knows where you live. Which would you rather: miss a deadline, or be hacked up into little pieces by an angry developer?
I came across this recently while I was developing a module for PyroCMS. Some of the PyroCMS tables contain ENUM columns, which Doctrine doesn’t support. You would think that this wouldn’t be an issue since these tables are not mapped, but apparently when Doctrine builds the schema it includes all tables in the database – even if they are not mapped. This has been reported as an issue, but the Doctrine team has given it a low priority.
The symptom? When using the SchemaTool to create, update, or drop the schema, an exception is thrown:
Fatal error: Uncaught exception 'Doctrine\DBAL\DBALException' with message 'Unknown database type enum requested, Doctrine\DBAL\Platforms\MySqlPlatform may not support it.'
Thankfully, the fix is very easy. There is even a Doctrine Cookbook article about it. All you have to do is register the ENUM type as a Doctrine varchar (string):
/** @var $em \Doctrine\ORM\EntityManager */
$platform = $em->getConnection()->getDatabasePlatform();
$platform->registerDoctrineTypeMapping('enum', 'string');
This fix can be applied to any unsupported data type, for example SET (which is also used in PyroCMS):
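The SET example is cut off above, but following the same pattern as the enum fix, it should just be one more mapping registration (this line is my extrapolation from the cookbook approach, not code from the post):

```php
$platform->registerDoctrineTypeMapping('set', 'string');
```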