Monthly Archives: November 2014

Two great books about JavaScript

Recently I read two great books about JavaScript: JavaScript: The Good Parts and Effective JavaScript: 68 Specific Ways to Harness the Power of JavaScript. Neither is for beginners; both target programmers who already know an object-oriented language (e.g., C#, Java, or C++).

JavaScript: The Good Parts

JavaScript: The Good Parts - cover
Written by the “Godfather of JavaScript”: Douglas Crockford. This book is an overview of JavaScript language features that differ from those of object-oriented languages such as C# or Java. Additionally, it surveys the good and the bad parts of the language: what to avoid, and how. You can read it in a week (~200 pages).

In addition to JavaScript: The Good Parts, I recommend watching Crockford on JavaScript (mirror). You will learn not only about JavaScript, but also about the history of programming languages. There is also a JavaScript the Good Parts course by Douglas Crockford on Pluralsight. That course focuses only on JavaScript, though.

Effective JavaScript: 68 Specific Ways to Harness the Power of JavaScript

Effective JavaScript - cover
I like this book even better than “The Good Parts”. It contains very good examples that illustrate specific aspects of the language. These examples, combined with the associated explanations, make the book clear and very informative. What I especially like is the structure: 68 short items, each with an explanation, analysis, and examples. It is also not very long (~200 pages). If you can read only one of the books described in this post, read this one.

Other JavaScript books

I would be very happy to hear from you about other JavaScript books that are (or are not) worth reading. So far, besides the books mentioned above, I have heard good things about:

Can you recommend any of them? Share your opinions in the comments.

WordPress on Azure: Exceeded ClearDB size = lock on INSERT/UPDATE (not able to log in to the admin panel)

My WordPress blog is hosted on Windows Azure, and I am using the only MySQL provider that is available on Azure: ClearDB.

Yesterday I couldn’t log in to the admin panel. I had no idea what was going on, because the blog itself was working. I googled for the cause and a solution, checked the Azure logs and monitoring on the Azure Portal, and accidentally noticed that I had exceeded the ClearDB quota (20 MB). I did not receive any notification from ClearDB, though. What is important: if you exceed this limit, they block INSERT and UPDATE operations. My guess is that WordPress tries to INSERT/UPDATE something in the database when you log in, which is why I couldn’t.

I did not want to upgrade from the free instance to $9.99/month (the cheapest upgrade option). Fortunately, I was able to connect to the database using MySQL Workbench and optimize it.

I removed post revisions:

DELETE FROM wp_posts WHERE post_type = "revision";

And transients:

DELETE FROM wp_options WHERE option_name LIKE ('%\_transient\_%');

This saved a lot of space: the database went from 20.28 MB down to 10.42 MB (transients alone occupied almost 8 MB!):


ClearDB quota

After I did that, I was able to log in. However, the INSERT/UPDATE lock is not revoked immediately; I had to wait somewhere between 10 minutes and 2 hours. I went to the swimming pool in the meantime, so I am not sure exactly how long it took.

A useful SQL command to check your database size:

SELECT SUM(round(((data_length + index_length) / 1024 / 1024), 2)) "Size in MB"
FROM information_schema.TABLES
WHERE table_schema = '$DB_NAME';

You can also check the size of each table:

SELECT table_name AS "Tables",
round(((data_length + index_length) / 1024 / 1024), 2) "Size in MB"
FROM information_schema.TABLES
WHERE table_schema = '$DB_NAME'
ORDER BY (data_length + index_length) DESC;
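
The byte-to-MB arithmetic in these queries (two divisions by 1024, rounded to two decimal places) can be mirrored in plain JavaScript, for example to post-process query results in a Node script. The helper name toMB is mine, not part of any MySQL or WordPress API:

```javascript
// Mirrors the SQL expression round(((data_length + index_length) / 1024 / 1024), 2).
// toMB is a hypothetical helper name used for illustration only.
function toMB(dataLength, indexLength) {
  var bytes = dataLength + indexLength;
  return Math.round((bytes / 1024 / 1024) * 100) / 100;
}

console.log(toMB(10 * 1024 * 1024, 0)); // a table with 10 MiB of data and no index
```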

For the future, I pinned the database size tile to my Azure Portal dashboard, so now I will see it every time I visit the portal. I also limited the number of post/page revisions to 2 by editing the wp-config.php file and inserting this line:

define('WP_POST_REVISIONS', 2);

This should be enough to stay under the quota for some time, but I will need a permanent solution. I am thinking about hosting my own MySQL database on a Linux VM on Azure (cost: $13+).

During the troubleshooting I found a very good blog post by John Papa: Tips for WordPress on Azure. I recommend checking it out if you have a WordPress blog on Azure. It will help you optimize your WordPress database as well.

EDIT: The plugin Optimize Database after Deleting Revisions allows you to clean up the database even more efficiently. I managed to slim my DB down by another 50%, to 4.11 MB, which gives an almost 80% size decrease from the original 20+ MB.

ClearDB quota with plugin

What is even cooler about this plugin: you can create a schedule to run it automatically (Settings -> Optimize DB options):

Optimize Database after Deleting Revisions - Options

Gulp – tutorial


Gulp is a streaming build system (a.k.a. task runner). It has plugins that allow you to run tasks such as TypeScript-to-JavaScript compilation, Less-to-CSS compilation, bundling, minification, running your own scripts, and much, much more.


Prerequisite: npm (Node Packaged Modules). You can install it, together with Node.js, using Chocolatey:

cinst nodejs

Once you have npm, you can install Gulp:

npm install -g gulp

After installation, locate the “gulp.cmd” file and add its location to your PATH. The location depends on how you installed Node.js. On one of my machines, where I installed Node.js using the installer, it is in the C:\Users\jjed\AppData\Roaming\npm directory. On another machine, where I installed Node.js from Chocolatey, it is in C:\ProgramData\chocolatey\lib\nodejs.commandline.0.10.31\tools. I recommend Everything Search Engine, not only for this task but for searching all kinds of files on your machine. It is ridiculously fast, and when you pin it to your taskbar as the first item, you can invoke it with WIN+1.

I wish the npm installation did this work (adding gulp to PATH) automatically.

Gulp Hello World

Open a console (I recommend ConEmu as your console on Windows) and cd into your project directory (for now, it can even be an empty directory).

From your project root directory run:

npm install --save-dev gulp
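
The --save-dev flag records gulp in your project’s package.json under devDependencies, roughly like this (the exact version number will vary; gulp was at 3.8.x around this time):

```json
{
  "devDependencies": {
    "gulp": "^3.8.10"
  }
}
```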

Create gulpfile.js file in the root of your project:

var gulp = require('gulp');

gulp.task('default', function () {
  console.log("Hello, world!");
});

Run gulp:

gulp
You should see the following result (with different time stamps and path, probably):

[21:45:32] Using gulpfile C:\dev\myproj\gulpfile.js
[21:45:32] Starting 'default'...
Hello, world!
[21:45:32] Finished 'default' after 141 µs

The default task is invoked when you run gulp without any arguments. You can run other tasks from the default task. The file below contains a hello task and invokes it from the default task:

var gulp = require('gulp');

gulp.task('hello', function () {
  console.log("Hello, world!");
});

gulp.task('default', ['hello']);

The result of running gulp is a little bit different now:

[21:49:21] Using gulpfile C:\dev\myproj\gulpfile.js
[21:49:21] Starting 'hello'...
Hello, world!
[21:49:21] Finished 'hello' after 210 µs
[21:49:21] Starting 'default'...
[21:49:21] Finished 'default' after 8.21 µs
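
The idea behind this output is: run a task’s dependencies first, then the task itself (which has an empty body here, hence the microsecond run time). A toy sketch of that dependency-before-task order in plain JavaScript (an illustration of the concept only, not gulp’s actual implementation, which runs dependencies asynchronously):

```javascript
var tasks = {};
var log = [];

// Register a task with optional dependencies, like gulp.task(name, deps, fn).
function task(name, deps, fn) {
  tasks[name] = { deps: deps || [], fn: fn };
}

// Run all dependencies first, then the task body itself.
function run(name) {
  var t = tasks[name];
  t.deps.forEach(run);
  if (t.fn) t.fn();
  log.push(name);
}

task('hello', [], function () { console.log('Hello, world!'); });
task('default', ['hello']);
run('default');
```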

Using Gulp to compile TypeScript files ‘on change’

Install TypeScript (if you do not have it installed already):

npm install -g typescript

Make sure tsc is in your path.

Install TypeScript compiler for gulp.js (from your project root directory):

npm install --save-dev gulp-tsc

Create TypeScript directory, and cd into it:

mkdir TypeScript
cd TypeScript

Create a simple hello.ts file:

function greeter(person: string) {
    return "Hello, " + person;
}

var user = "Anders Hejlsberg";

console.log(greeter(user));

Go to the root directory of your project and create a js directory (it will be the output directory for the generated JavaScript files):

mkdir js

Modify gulpfile.js:

var gulp = require('gulp'),
	typescript = require('gulp-tsc');

gulp.task('typescript-compile', function () {
  return gulp.src(['TypeScript/*.ts'])
    .pipe(typescript())
    .pipe(gulp.dest('js/'));
});

gulp.task('watch', function () {
  gulp.watch('TypeScript/*.ts', ['typescript-compile']);
});

gulp.task('default', ['typescript-compile', 'watch']);

Once you run gulp, it will compile all TypeScript files from the TypeScript directory into the js directory. Furthermore, Gulp will not exit: it will watch all .ts files under the TypeScript directory and recompile them every time you change and save one of them. Try it!

Prevent gulp.js from crashing by handling errors

You may notice that if one of your TypeScript files causes a compilation error (e.g., because of a syntax error), gulp crashes.

To avoid this, you can add a simple error-handling function (errorLog) to the gulpfile.js file and call it in the typescript-compile task pipeline:

var gulp = require('gulp'),
	typescript = require('gulp-tsc');

function errorLog (error) {
  console.error(error.message);
}

gulp.task('typescript-compile', function () {
  return gulp.src(['TypeScript/*.ts'])
    .pipe(typescript())
    .on('error', errorLog)
    .pipe(gulp.dest('js/'));
});

gulp.task('watch', function () {
  gulp.watch('TypeScript/*.ts', ['typescript-compile']);
});

gulp.task('default', ['typescript-compile', 'watch']);

Now gulp will not crash; it will display the compilation error and recompile the files after the next file change.

Using gulp.js to run batch/bash script

Let’s assume that you want to run some script (a batch file on Windows, or a bash script on Unix) after the TypeScript files are compiled.

First, we need to install gulp-shell plugin:

npm install --save-dev gulp-shell

To keep it simple, our script will only output text “TypeScript compilation done” to the console.

Create file build.cmd in your project root directory:

echo "TypeScript compilation done"

And modify gulpfile.js:

var gulp = require('gulp'),
  run = require('gulp-shell'),
  typescript = require('gulp-tsc');

function errorLog (error) {
  console.error(error.message);
}

gulp.task('typescript-compile', function () {
  return gulp.src(['TypeScript/*.ts'])
    .pipe(typescript())
    .on('error', errorLog)
    .pipe(gulp.dest('js/'))
    .pipe(run('build'));
});

gulp.task('watch', function () {
  gulp.watch('TypeScript/*.ts', ['typescript-compile']);
});

gulp.task('default', ['typescript-compile', 'watch']);

In real life, you may want to copy all compiled files to some other directory, or do some other project-specific task.

Using gulp-shell you can run any command, or a set of commands (concatenated with ‘&’ on Windows, or ‘&&’ on Unix), e.g.:

shell(['cd c:/dev/myproj/src & build']);


Gulp is a very handy tool. However, it is not the only one: its main rival is Grunt.js. Actually, Gulp was inspired by Grunt. Most people (based on my research) prefer Gulp over Grunt. The main arguments: it streams results (runs multiple operations on a set of files without intermediate writes), and it is less verbose (it lets you do more with less configuration). Check the Grunt vs Gulp comparison in this blog post, and this Steve Sanderson talk (starting from 30:50).

To get more out of Gulp, check Getting Started With Gulp, Learning Gulp, and the gulp documentation.

Are you using Gulp or Grunt, or some other task runner?

EDIT: I updated the post with return statements in the tasks. This informs the task system that a particular task is asynchronous, and prevents occasional crashes when the task is called from the watch task.

Running the greatest VM on Azure

Recently Microsoft Azure introduced the new D-series virtual machine sizes. The “greatest” available VM has 16 cores and 112 GB RAM. In my imagination, it looks like this:

super PC

I thought it would be cool to create one and play with it for a while. Not for a whole month, though, because that would cost almost $1000 (~700-800 EUR):

Azure VMs pricing

However, what is cool about Azure is that you can scale a VM down when you are not using it, even to the cheapest option: A0 Basic (~10 EUR/month). And using a D14 for an hour costs only ~1 EUR.

While wondering which OS to install on the VM, I found that Azure already offers a Windows 10 preview VM:

Azure Windows 10 VM


This is how it looks after installation:

Huge VM

Working on this VM was even better (faster) than on my PC at work (a Xeon with 6 cores and 32 GB RAM). To stress the VM, I opened over 100 instances of Visual Studio:

100 Visual Studios on Azure VM

After opening 90 instances, the VM slowed down. I opened 103 instances of Visual Studio in total, and the VM didn’t crash.

The feeling of having the most powerful machine I have ever worked on is amazing, even though it is virtual. The most amazing part is that it cost me only 1 EUR to play with it for an hour. I can get it within a few minutes, and get rid of it within seconds.

I am using it from time to time as my playground, scaling it up and down according to my needs.

Later this year, the G-series of VMs will be available on Azure. The biggest in this series will be the G5: 32 cores + 448 GB RAM. That’s gonna be… awesome!