Choosing Boring Technology

A wonderful presentation by Dan McKinley of Stripe on how technology companies go wrong by chasing the next piece of shiny technology instead of relying on their existing, well-tested stack.

He outlines how this increases the overall cost and cripples the development team in many ways. I have to admit I did this myself for many years, reaching for the next framework, the next data store, and so on.

I learned the lesson the hard way, and thankfully, since then I have chosen fewer, better-known technologies that serve the need.

Organizing Life

I am a procrastinator, one who has mastered its art. I have always been so. I do important things at the very last minute, with the highest intensity. I have succeeded with this approach more than I should have; as a result, I have narrowly escaped failure every time. However, it doesn’t give me any joy or satisfaction. It leaves an “I could have done better” feeling inside all the time, and it’s something that I am not proud of.

Well, enough of that. This has to stop, and I have to make some serious changes to break this nasty habit. I’m reading some books and blog posts to understand it better. I’m also trying to develop a system that, if followed properly, will allow me to avoid it.

So far I am adopting the following changes in my life:

I am hopeful I can stick to these changes and get better at this. I’ll review this every two weeks and make any adjustments I see fit.

Photo by Kyle Szegedi.

Concurrent HTTP requests in PHP using pecl_http

The pecl_http extension has a little gem that can be handy at times – HttpRequestPool. With it, you can send concurrent HTTP requests and gain efficiency by fetching unrelated data at once. For example, if your application needs to retrieve a user’s profile, order history and current balance from an external source, you can send parallel requests to the API and get everything together.

Here is a simple example code to illustrate that:

<?php

$endpoint = "http://api.someservice.com";
$userId = 101;

$urls = array(
    $endpoint . '/profile/' . $userId,
    $endpoint . '/orderHistory/' . $userId,
    $endpoint . '/currentBalance/' . $userId
);

$pool = new HttpRequestPool;

foreach ($urls as $url) {
    $req = new HttpRequest($url, HTTP_METH_GET);
    $pool->attach($req);
}

// send all the requests; control returns to the script once
// all the requests are complete or timed out
$pool->send();

foreach ($pool as $request) {
    echo $request->getUrl(), PHP_EOL;
    echo $request->getResponseBody(), PHP_EOL . PHP_EOL;
}
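The loop above prints every response body regardless of outcome, but requests in a pool succeed or fail independently. As a possible extension – still using the pecl_http v1 API and the same imaginary endpoint – each response could be checked before use:

```php
<?php

// Sketch only: assumes the pecl_http v1 extension and the imaginary
// endpoint from the example above. A failed or timed-out request does
// not abort the pool, so check each response individually.
foreach ($pool as $request) {
    if ($request->getResponseCode() == 200) {
        $data = json_decode($request->getResponseBody(), true);
        // ... merge $data into the page being built
    } else {
        // Log and degrade gracefully for this one data source.
        error_log(sprintf(
            'Request to %s failed with status %d',
            $request->getUrl(),
            $request->getResponseCode()
        ));
    }
}
```

This way one slow or broken API endpoint degrades a single widget instead of the whole page.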

Amazon S3 integration with Symfony2 and Gaufrette

It’s amazing how Symfony2 has created an ecosystem where you can just add a bundle to your application and, in minutes, make use of functionality or an integration that would take you days to code up yourself. A while ago, I needed to add S3 CDN support to our Loosemonkies application so that the avatars uploaded by job seekers and the company logos uploaded by employers would be stored on a globally accessible CDN. I started looking for a ready-to-use bundle that I could just add to the project, tweak a bit of config, and everything else would just work. This time, however, the task turned out to be a bit tougher.

After much searching and evaluating, I stumbled upon this wonderful post. The author shows how to use Gaufrette and the Amazon AWS bundle together to achieve exactly what I was looking for. It took me a while to follow the steps, but I finally got it integrated. Then I had a new requirement: I needed to upload assets already sitting on the local filesystem, so I added an additional method to the PhotoUploader class to handle it – uploadFromUrl. It guesses the mime type of the file from its extension, since in this case PHP does not hand us the mime type. It worked beautifully and we were all set.

Sharing the code here in case others find it useful.

<?php

namespace LM\Bundle\CoreBundle\Controller;

use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;

class AppController extends Controller
{
    /**
     * Upload Image to S3
     *
     * @param string $name      Image field name
     * @param int    $maxWidth  Maximum thumb width
     * @param int    $maxHeight Maximum thumb height
     *
     * @return string
     */
    protected function uploadImage($name, $maxWidth = 100, $maxHeight = 100)
    {
        $image = $this->getRequest()->files->get($name);

        $uploader = $this->get('core_storage.photo_uploader');
        $uploadedUrl = $uploader->upload($image);

        return $this->container->getParameter('amazon_s3_base_url') . $uploadedUrl;
    }
}
{
    "require": {
        "knplabs/gaufrette": "dev-master",
        "knplabs/knp-gaufrette-bundle": "dev-master",
        "amazonwebservices/aws-sdk-for-php": "dev-master"
    }
}
imports:
    - { resource: parameters.yml }
    - { resource: security.yml }
    - { resource: @LMCoreBundle/Resources/config/services.yml }

knp_gaufrette:
    adapters:
        photo_storage:
            amazon_s3:
                amazon_s3_id: loosemonkies_core.amazon_s3
                bucket_name: %amazon_s3_bucket_name%
                create: false
                options:
                    create: true
    filesystems:
        photo_storage:
            adapter: photo_storage
            alias: photo_storage_filesystem

loosemonkies_core:
    amazon_s3:
        aws_key: %amazon_aws_key%
        aws_secret_key: %amazon_aws_secret_key%
        base_url: %amazon_s3_base_url%
<?php

namespace LM\Bundle\CoreBundle\DependencyInjection;

use Symfony\Component\Config\Definition\Builder\TreeBuilder;
use Symfony\Component\Config\Definition\ConfigurationInterface;

/**
 * This is the class that validates and merges configuration from your app/config files
 *
 * To learn more see {@link http://symfony.com/doc/current/cookbook/bundles/extension.html#cookbook-bundles-extension-config-class}
 */
class Configuration implements ConfigurationInterface
{
    /**
     * {@inheritDoc}
     */
    public function getConfigTreeBuilder()
    {
        $treeBuilder = new TreeBuilder();
        $rootNode = $treeBuilder->root('loosemonkies_core');

        $rootNode
            ->children()
                ->arrayNode('amazon_s3')
                    ->children()
                        ->scalarNode('aws_key')->end()
                        ->scalarNode('aws_secret_key')->end()
                        ->scalarNode('base_url')->end()
                    ->end()
                ->end()
            ->end();

        return $treeBuilder;
    }
}
# This file is auto-generated during the composer install
parameters:
    locale: en
    secret: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
    amazon_aws_key: XXXXXXXXXXXXXXXXXXXX
    amazon_aws_secret_key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX+
    amazon_s3_bucket_name: dev-assets
    amazon_s3_base_url: 'http://s3.amazonaws.com/dev-assets/'
<?php

namespace LM\Bundle\CoreBundle\Service;

use Symfony\Component\HttpFoundation\File\UploadedFile;
use Gaufrette\Filesystem;

class PhotoUploader
{
    private static $allowedMimeTypes = array('image/jpeg', 'image/png', 'image/gif');
    private $filesystem;

    public function __construct(Filesystem $filesystem)
    {
        $this->filesystem = $filesystem;
    }

    public function upload(UploadedFile $file)
    {
        // Check if the file's mime type is in the list of allowed mime types.
        if (!in_array($file->getClientMimeType(), self::$allowedMimeTypes)) {
            throw new \InvalidArgumentException(sprintf('Files of type %s are not allowed.', $file->getClientMimeType()));
        }

        // Generate a unique filename based on the date and add the file extension of the uploaded file
        $filename = sprintf('%s/%s/%s/%s.%s', date('Y'), date('m'), date('d'), uniqid(), $file->getClientOriginalExtension());

        $adapter = $this->filesystem->getAdapter();
        $adapter->setMetadata($filename, array('contentType' => $file->getClientMimeType()));
        $adapter->write($filename, file_get_contents($file->getPathname()));

        return $filename;
    }

    public function uploadFromUrl($url)
    {
        // Get the file extension
        $extension = pathinfo($url, PATHINFO_EXTENSION);

        // Generate a unique filename based on the date and add the file extension
        $filename = sprintf('%s/%s/%s/%s.%s', date('Y'), date('m'), date('d'), uniqid(), $extension);

        // Guess the mime type from the extension
        $mimeType = $this->guessMimeType($extension);

        $adapter = $this->filesystem->getAdapter();
        $adapter->setMetadata($filename, array('contentType' => $mimeType));
        $adapter->write($filename, file_get_contents($url));

        return $filename;
    }

    private function guessMimeType($extension)
    {
        $mimeTypes = array(
            'txt'  => 'text/plain',
            'htm'  => 'text/html',
            'html' => 'text/html',
            'php'  => 'text/html',
            'css'  => 'text/css',
            'js'   => 'application/javascript',
            'json' => 'application/json',
            'xml'  => 'application/xml',
            'swf'  => 'application/x-shockwave-flash',
            'flv'  => 'video/x-flv',

            // images
            'png'  => 'image/png',
            'jpe'  => 'image/jpeg',
            'jpeg' => 'image/jpeg',
            'jpg'  => 'image/jpeg',
            'gif'  => 'image/gif',
            'bmp'  => 'image/bmp',
            'ico'  => 'image/vnd.microsoft.icon',
            'tiff' => 'image/tiff',
            'tif'  => 'image/tiff',
            'svg'  => 'image/svg+xml',
            'svgz' => 'image/svg+xml',

            // archives
            'zip'  => 'application/zip',
            'rar'  => 'application/x-rar-compressed',
            'exe'  => 'application/x-msdownload',
            'msi'  => 'application/x-msdownload',
            'cab'  => 'application/vnd.ms-cab-compressed',

            // audio/video
            'mp3'  => 'audio/mpeg',
            'qt'   => 'video/quicktime',
            'mov'  => 'video/quicktime',

            // adobe
            'pdf'  => 'application/pdf',
            'psd'  => 'image/vnd.adobe.photoshop',
            'ai'   => 'application/postscript',
            'eps'  => 'application/postscript',
            'ps'   => 'application/postscript',

            // ms office
            'doc'  => 'application/msword',
            'rtf'  => 'application/rtf',
            'xls'  => 'application/vnd.ms-excel',
            'ppt'  => 'application/vnd.ms-powerpoint',
            'docx' => 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
            'xlsx' => 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
            'pptx' => 'application/vnd.openxmlformats-officedocument.presentationml.presentation',

            // open office
            'odt'  => 'application/vnd.oasis.opendocument.text',
            'ods'  => 'application/vnd.oasis.opendocument.spreadsheet',
        );

        if (array_key_exists($extension, $mimeTypes)) {
            return $mimeTypes[$extension];
        }

        return 'application/octet-stream';
    }
}
parameters:
    core.amazon_s3.class: AmazonS3
    core_storage.photo_uploader.class: LM\Bundle\CoreBundle\Service\PhotoUploader

services:
    loosemonkies_core.amazon_s3:
        class: %core.amazon_s3.class%
        arguments:
            - { key: %amazon_aws_key%, secret: %amazon_aws_secret_key% }

    core_storage.photo_uploader:
        class: %core_storage.photo_uploader.class%
        arguments: [@photo_storage_filesystem]

Optimizing variables cache in Drupal 6

In Drupal 6, a number of caching strategies are employed to handle heavy traffic. One of them is caching the whole variables table as a single serialized value: it is stored in the cache and unserialized into the global $conf variable on every request.

On one of our production sites, we had a hard time keeping up with the memory PHP needed just to unserialize this value from the cache. The variables table was so large that we had to allocate around 1GB of memory to each PHP process so the value could be unserialized without memory exhaustion. This made it much harder to scale the application.
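The cost is easy to reproduce outside Drupal. The following standalone sketch (illustrative sizes only, not our real data) serializes a large array the way the variables cache does, then measures what a single unserialize of the blob costs:

```php
<?php

// Illustration only -- not Drupal code. The variables cache stores one
// serialized blob, and reading it back materializes the entire structure
// at once, no matter how few variables the current page actually needs.
$variables = array();
for ($i = 0; $i < 10000; $i++) {
    $variables['setting_' . $i] = str_repeat('x', 200);
}

$blob = serialize($variables);

$before = memory_get_usage();
$copy   = unserialize($blob);
$used   = memory_get_usage() - $before;

printf("blob: %d bytes, unserialized copy: ~%d bytes\n", strlen($blob), $used);
```

Every PHP process pays this cost on every request, which is why splitting the blob into per-item entries, as described next, was worth the extra memcache round trips.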

So, we decided to do something about it and successfully handled it by doing the following:

1. First, we installed the memcache module to move cache storage from the database to memory.

2. We then edited the memcache module’s cache_get and cache_set functions to store and retrieve the items of the variables array individually, splitting them on write and joining them on read.

3. This costs one memcache call per item in the variables array, but consumes only a small amount of memory since there is no longer one huge unserialize operation.

4. We ran a few tests to verify the site worked as before, and found it working perfectly!

Here is the code, in case you are facing a similar issue:

/sites/all/modules/contrib/memcache/memcache.inc

<?php

// ...beginning part of the file

function cache_set($cid, $data, $table = 'cache', $expire = CACHE_PERMANENT, $headers = NULL) {

  // Handle database fallback first.
  $bins = variable_get('memcache_bins', array());
  if (!is_null($table) && isset($bins[$table]) && $bins[$table] == 'database') {
    return _cache_set($cid, $data, $table, $expire, $headers);
  }

  // In the case of special cache items, we keep the individual items as
  // separate cache entries. Later, at retrieval time, we join them together.
  if (memcache_is_special_cache_item($cid)) {

    $keys = array_keys($data);
    foreach ($keys as $key) {
      cache_set($cid . '_' . $key, $data[$key]);
    }

    cache_set($cid . '_keys', $keys);
    return TRUE;
  }

  // ...remaining part of the function
}

function cache_get($cid, $table = 'cache') {

  // Handle excluded bins first.
  $bins = variable_get('memcache_bins', array());
  if (!is_null($table) && isset($bins[$table]) && $bins[$table] == 'database') {
    return _cache_get($cid, $table);
  }

  // The special cache item was previously saved as individual entries,
  // so now we have to retrieve them separately, join them together
  // and return them as one item.
  if (memcache_is_special_cache_item($cid)) {

    $keys = cache_get($cid . '_keys');
    if (!$keys || is_null($keys->data)) {
      return FALSE;
    }

    $data = array();
    foreach ($keys->data as $key) {
      // cache_get() returns a cache object, so unwrap its data property.
      $item = cache_get($cid . '_' . $key);
      $data[$key] = $item->data;
    }

    $cache = new stdClass();
    $cache->data = $data;

    return $cache;
  }

  // ...remaining part of the function
}

function memcache_is_special_cache_item($cid) {
  $specials = array('variables', 'strongarm');
  return in_array($cid, $specials);
}

// ...remaining part of the file
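The split/join idea itself is independent of memcache. This standalone sketch (plain PHP; a local array stands in for memcache, and the function names are made up for illustration) shows the round trip the patched cache_set and cache_get perform:

```php
<?php

// Standalone illustration of the split/join strategy above. $store stands
// in for memcache; the real code delegates to the memcache module instead.
function split_cache_set($cid, array $data, array &$store) {
    $keys = array_keys($data);
    foreach ($keys as $key) {
        // One small entry per item instead of one huge serialized blob.
        $store[$cid . '_' . $key] = serialize($data[$key]);
    }
    // A key index so the reader knows which entries to reassemble.
    $store[$cid . '_keys'] = serialize($keys);
}

function split_cache_get($cid, array $store) {
    if (!isset($store[$cid . '_keys'])) {
        return FALSE;
    }
    $data = array();
    foreach (unserialize($store[$cid . '_keys']) as $key) {
        // Each unserialize call touches only one item's worth of memory.
        $data[$key] = unserialize($store[$cid . '_' . $key]);
    }
    return $data;
}

$store = array();
$variables = array('site_name' => 'Example', 'cache' => 1, 'theme' => 'garland');

split_cache_set('variables', $variables, $store);

var_dump(split_cache_get('variables', $store) === $variables); // bool(true)
```

Peak memory is now bounded by the largest single variable rather than by the whole table, at the price of one cache lookup per item.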