Blog archives

49 posts in the archive

Auto-wiring for Zend ServiceManager

Writing factories for zend-servicemanager can be a tedious, repetitive task. Most of the factories I write follow the same pattern: pull some dependencies from the container, instantiate a new object, and return it. How can you avoid the repetition?
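To make the repetition concrete, here is a minimal sketch of the auto-wiring idea: a single generic factory that uses reflection to resolve each constructor parameter from the container by its type hint. The class name `AutowireFactory` is illustrative, not the actual module's API, and the sketch assumes every constructor parameter has a class type hint.

```php
<?php
// Hypothetical reflection-based factory - a sketch of the auto-wiring
// concept, not the module's real implementation.
use Interop\Container\ContainerInterface;

class AutowireFactory
{
    public function __invoke(ContainerInterface $container, $requestedName)
    {
        $constructor = (new \ReflectionClass($requestedName))->getConstructor();
        if ($constructor === null) {
            // No dependencies - just instantiate the class.
            return new $requestedName();
        }

        // Resolve each constructor parameter from the container by its
        // type hint (assumes every parameter is class-hinted).
        $arguments = [];
        foreach ($constructor->getParameters() as $parameter) {
            $arguments[] = $container->get($parameter->getClass()->getName());
        }

        return new $requestedName(...$arguments);
    }
}
```

Registered as an abstract factory, one class like this could replace dozens of hand-written factories that all follow the pull-and-instantiate pattern.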

Extreme caching with PSR-7

PSR-7 brought some interesting patterns that can be applied to a PHP application regardless of the framework it uses. It is particularly interesting when it comes to performance - no matter what technology your project uses, you can apply the same techniques to make it faster.

Here I will show how PSR-7 middleware can be used to cache an application's output. I call it "extreme caching", because I want to trigger it as early as possible, in order to reduce the amount of code executed on each request.

I will present this pattern on a Zend Expressive-based application, but it will work for any PSR-7 framework that uses middleware with the following signature (which has become a de facto standard):

function ($request, $response, $next) { }
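For illustration, here is a hedged sketch of such a caching middleware. The `$cache` object is an assumption - any key/value store with `get()`/`set()` methods would do - and the cache key strategy is deliberately simplified:

```php
<?php
// Sketch of "extreme caching" middleware, assuming a simple $cache
// object with get()/set() - not a specific library's API.
$cacheMiddleware = function ($request, $response, $next) use ($cache) {
    $key = md5((string) $request->getUri());

    if (null !== ($cached = $cache->get($key))) {
        // Cache hit: short-circuit the pipeline here, so none of the
        // later middleware (routing, controllers, views) ever runs.
        $response->getBody()->write($cached);
        return $response;
    }

    // Cache miss: run the rest of the application, then store its output.
    $response = $next($request, $response);
    $cache->set($key, (string) $response->getBody());

    return $response;
};
```

The point of registering this middleware first in the pipeline is exactly the "extreme" part: on a hit, the request never reaches the framework proper.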

Speeding up PHP application bootstrap with Class Dumper

One reason PHP is still considered slow is a consequence of how it works in a web server environment: every time a client sends a request, the application is initialized from scratch - it runs all of its bootstrap code. Bootstrapping is repeated over and over again, for every connecting client.

While this is an obvious waste of resources, it is also very difficult to avoid without rewriting the application under a different architecture. Is there anything that could be done to at least reduce the impact of application bootstrap, without making any changes to the actual application? As it turns out, there is.
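The general idea behind a class dumper can be sketched as follows - this is a hedged illustration of the concept, not the actual tool's API, and it glosses over details (namespaces, `use` statements, conditionally defined classes) that a real implementation must handle:

```php
<?php
// Concept sketch: concatenate the classes loaded during bootstrap into
// a single file, so one include replaces many autoloader lookups and
// disk reads on every subsequent request. Class list is illustrative.
$classes = [
    'Zend\ServiceManager\ServiceManager',
    'Zend\EventManager\EventManager',
];

$dump = "<?php\n";
foreach ($classes as $class) {
    $file = (new \ReflectionClass($class))->getFileName();
    // Strip the opening tag so the files can be merged into one script.
    $dump .= preg_replace('/^<\?php/', '', file_get_contents($file)) . "\n";
}
file_put_contents('data/cache/classes.php', $dump);
```

The bootstrap then includes `data/cache/classes.php` once, before the autoloader would otherwise load those classes one file at a time.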

PSR-7: HTTP Messages Today

PSR-7 is here, and it is big. Now, more than a month after it was voted on, a lot of work has been put into projects supporting this standard. Even though we're still at the beginning of this great journey, it is exciting to see what is already available, thanks to the great work of the PHP community.

Having watched related projects since the standard's inception, I will present packages that can serve as a foundation for actual applications: HTTP message implementations, dispatchers, and micro-frameworks.

Testing ZF2 module services

There's an important question that often arises when working on a Zend Framework 2 module: should I test service factories? After all, they are usually trivial: they create an object and inject it with dependencies from the ServiceManager. Having one test per factory seems like overkill.

It's better to take a step back and ask yourself: what exactly do you want to test?
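One answer worth sketching: rather than one trivial test per factory, a single smoke test can assert that the ServiceManager is able to create every registered service from the module's real configuration. This is a hedged sketch - the config path and key layout are assumptions about a typical ZF2 module, and the ZF2-era base class was `PHPUnit_Framework_TestCase` rather than the namespaced one used here:

```php
<?php
// Smoke test: every factory registered in the module config must
// produce a service without throwing. Paths/keys are illustrative.
use PHPUnit\Framework\TestCase;
use Zend\ServiceManager\ServiceManager;

class ModuleServicesTest extends TestCase
{
    public function testAllServicesCanBeCreated()
    {
        $config = include __DIR__ . '/../config/module.config.php';
        $serviceManager = new ServiceManager($config['service_manager']);

        foreach (array_keys($config['service_manager']['factories']) as $name) {
            $this->assertNotNull($serviceManager->get($name));
        }
    }
}
```

One test like this exercises every factory, which is usually what you actually care about: that wiring the module together does not blow up.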

MtMail: e-mail module for ZF2

I'm happy to present a ZF2 module that handles composing and sending e-mail messages.

Why another module? There are a few already available on the ZF modules website. However, when I was looking for a solution to use in my application, I quickly realized that most of them were either outdated or missing features I needed. That's why I decided to write my own.

My intention was to create something powerful, but still simple to use. You can customize e-mail headers, add a layout, automatically generate a plaintext version of an HTML e-mail, and so on. But you can also start composing and sending e-mails from your controllers with just a few lines of code.

Using standalone Zend\View

Zend\View is a pretty advanced rendering engine with multiple useful features. It works nicely within ZF2's MVC stack, where it is automatically configured for you. But how do you use it without the full MVC?

This can be useful in some situations: when building Your Own Microframework™, when creating an application based on ZF2 components, or (in my case) when working on a module that is supposed to render something outside the MVC flow. All of these projects can benefit from nested templates, multiple rendering engines, or the pluggable architecture of Zend\View.

So, how do you do that?
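In its simplest form, the answer looks like this - a minimal standalone setup, assuming templates live in a local `templates/` directory:

```php
<?php
// Standalone Zend\View: a renderer plus a resolver that maps template
// names to .phtml files. The templates/ path is an assumption.
use Zend\View\Renderer\PhpRenderer;
use Zend\View\Resolver\TemplatePathStack;

$renderer = new PhpRenderer();
$renderer->setResolver(new TemplatePathStack([
    'script_paths' => [__DIR__ . '/templates'],
]));

// Renders templates/hello.phtml with the given variables.
echo $renderer->render('hello', ['name' => 'World']);
```

From here, view helpers, nested view models, and layouts can be added piece by piece, without pulling in the rest of the MVC stack.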

Extracting single table from huge MySQL dump

During the last few weeks I had to work with relatively big MySQL dumps. I had to find interesting rows in about 400 files, each of them taking 40 minutes to import. In order to speed things up, I found a simple tool that allowed me to extract only the interesting tables.

The tool is actually a single Perl script (available on GitHub). It allows extracting tables with a simple command:

user@host:~$ -t TABLE_NAME -r DUMP_FILE.sql

This command will print the dump to standard output, so you may want to redirect it to a file:

user@host:~$ -t TABLE_NAME -r DUMP_FILE.sql > table_name.sql

Finally, the script is able to read input from stdin, so it is easy to extract and import a single table from a compressed dump file:

user@host:~$ zcat DUMP_FILE.sql | -t TABLE_NAME \
| mysql dest_database -u username -p

Automated MySQL backup on dedicated server or VPS

I just moved all my small projects to a new dedicated server. I have to admit that until now I wasn't paying attention to regular backups. I simply ran mysqldump and copied everything to my laptop every few months. I didn't have any problems with that, as my data was not very critical. But this time I decided to build something better - I wanted database backups to be generated automatically, at regular intervals.

I knew more or less what to do, I just had to put all the pieces together. This tutorial shows the necessary steps to build a similar solution on your own server.
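The core of such a solution can be sketched in a few lines of shell - paths, the retention period, and the credentials setup are assumptions, not a prescription:

```shell
#!/bin/sh
# Sketch of a nightly MySQL backup: dump all databases, compress,
# and rotate old files. BACKUP_DIR and the 14-day retention are
# placeholders - adjust to your server.
BACKUP_DIR=/var/backups/mysql
mkdir -p "$BACKUP_DIR"

# Credentials are assumed to live in ~/.my.cnf, so the password
# never appears on the command line or in the process list.
mysqldump --all-databases --single-transaction \
    | gzip > "$BACKUP_DIR/all-databases-$(date +%F).sql.gz"

# Delete backups older than 14 days.
find "$BACKUP_DIR" -name '*.sql.gz' -mtime +14 -delete
```

Saved as, say, `/usr/local/bin/mysql-backup.sh`, it can then be scheduled with a cron entry such as `0 3 * * * /usr/local/bin/mysql-backup.sh`.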