This article is part of the article series "Node.JS Modules You Should Know About."

Hello everyone! This is the fifth post in my new node.js modules you should know about article series.

The first post was about dnode - the freestyle rpc library for node, the second was about optimist - the lightweight options parser for node, the third was about lazy - lazy lists for node, the fourth was about request - the swiss army knife of HTTP streaming.

This time I'll introduce you to hashish. Hashish is written by James Halliday, who is my co-founder at Browserling and Testling. In case you're wondering why I'm blogging about so many of his modules, it's because he has written 88 of them, and each one is absolutely brilliant.

Hashish is a JavaScript hash combinator library, or in other words, it contains a bunch of hash data structure manipulation functions.

Check out this example:

var Hash = require('hashish');

Hash({ a : 1, b : 2, c : 3, d : 4 })
    .map(function (x) { return x * 10 })
    .filter(function (x) { return x < 30 })
    .forEach(function (x, key) {
        console.log(key + ' => ' + x);
    })
;

Here a Hash object is constructed from the hash { a : 1, b : 2, c : 3, d : 4 }. Next, a function that multiplies each hash value by 10 is mapped over it. At this point the hash has become { a : 10, b : 20, c : 30, d : 40 }. Then a filter is applied that keeps only the elements whose value is less than 30. Now the hash is { a : 10, b : 20 }. Finally, the forEach combinator is applied to the remaining elements and each key, value pair is printed, producing the following output:

a => 10
b => 20

Notice how similar the interface for hash manipulation is to node-lazy, which I wrote about a few days ago. All the combinators can be chained, so your code stays beautiful.

If you can't or don't want to chain the functions, hashish also allows each function in the chainable interface to be attached to Hash in chainless form:

var Hash = require('hashish');
var obj = { a : 1, b : 2, c : 3, d : 4 };

var mapped = Hash.map(obj, function (x) {
    return x * 10
});

console.dir(mapped);

Notice how this code calls Hash.map on the obj hash. The output is each hash value multiplied by 10:

{ a: 10, b: 20, c: 30, d: 40 }

Hashish also provides various attributes in the chaining interface and functions in the Hash.xxx interface. For example:

$ node
> var Hash = require('hashish');
> var obj = { a : 1, b : 2, c : 3, d : 4 };
>
> Hash(obj).keys
[ 'a', 'b', 'c', 'd' ]
> Hash(obj).values
[ 1, 2, 3, 4 ]
> Hash(obj).length
4
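
These attributes have chainless counterparts too. Here's a minimal sketch, assuming Hash.keys and Hash.values are exposed as plain functions, as the chainless Hash.xxx interface described above suggests:

var Hash = require('hashish');
var obj = { a : 1, b : 2, c : 3, d : 4 };

// Chainless equivalents of the attributes from the REPL session above.
console.log(Hash.keys(obj));   // [ 'a', 'b', 'c', 'd' ]
console.log(Hash.values(obj)); // [ 1, 2, 3, 4 ]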

You can install hashish through npm:

npm install hashish

Sponsor this blog series!

Doing a node.js company and want your ad to appear in the series? The ad will go out to 14,000 rss subscribers, 7,000 email subscribers, and it will get viewed by thousands of my blog visitors! Email me and we'll set it up!

Enjoy!

If you love these articles, subscribe to my blog for more, follow me on Twitter to find out about my adventures, and watch me produce code on GitHub!

This article is part of the article series "Node.JS Modules You Should Know About."

Hey everyone! This is the fourth post in my new node.js modules you should know about article series.

The first post was about dnode - the freestyle rpc library for node, the second was about optimist - the lightweight options parser for node, the third was about lazy - lazy lists for node.

This time I'll introduce you to a very awesome module called request by Mikeal Rogers. Request is the swiss army knife of HTTP streaming.

Check this out:

var fs = require('fs')
var request = require('request');

request('http://google.com/doodle.png').pipe(fs.createWriteStream('doodle.png'))

Pow! You just streamed the response of an HTTP request to http://google.com/doodle.png into the local file doodle.png!

Here is more awesome stuff:

var fs = require('fs')
var request = require('request');

fs.createReadStream('file.json').pipe(request.put('http://mysite.com/obj.json'))

Pow! It streamed your local file file.json to http://mysite.com/obj.json as an HTTP PUT request!

var request = require('request');

request.get('http://google.com/img.png').pipe(request.put('http://mysite.com/img.png'))

Pow! This just streamed an HTTP GET from http://google.com/img.png into an HTTP PUT to http://mysite.com/img.png.

At Browserling we use this module for streaming data to and from CouchDB. Here is an example that saves a JSON document to Mikeal's test CouchDB:

var request = require('request')
var rand = Math.floor(Math.random()*100000000).toString()

request({
  method: 'PUT',
  uri: 'http://mikeal.iriscouch.com/testjs/' + rand,
  multipart: [
    {
      'content-type': 'application/json',
      'body': JSON.stringify({
        foo: 'bar',
        _attachments: {
          'message.txt': {
            follows: true,
            length: 18,
            'content_type': 'text/plain'
           }
         }
       })
    },
    { body: 'I am an attachment' }
  ] 
}, function (error, response, body) {
  if(response.statusCode == 201){
    console.log('document saved as: http://mikeal.iriscouch.com/testjs/'+ rand);
  } else {
    console.log('error: '+ response.statusCode);
    console.log(body);
  }
})
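
To round the example out, here's a minimal sketch of reading a document back from CouchDB with the same (error, response, body) callback style; the document id below is just a stand-in for the random id generated above:

var request = require('request');

// Stand-in for the random id used in the PUT example above.
var docid = 'some-document-id';

request.get('http://mikeal.iriscouch.com/testjs/' + docid, function (error, response, body) {
    if (error) throw error;
    var doc = JSON.parse(body);   // CouchDB replies with a JSON document
    console.dir(doc);
});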

Install it via npm, as always:

npm install request

Sponsor this blog series!

Doing a node.js company and want your ad to appear in the series? The ad will go out to 14,000 rss subscribers, 7,000 email subscribers, and it will get viewed by thousands of my blog visitors! Email me and we'll set it up!

See ya!

If you love these articles, subscribe to my blog for more, follow me on Twitter to find out about my adventures, and watch me produce code on GitHub!

This article is part of the article series "Node.JS Modules You Should Know About."

Hey everyone! This is the third post in my new node.js modules you should know about article series.

The first post was about dnode - the freestyle rpc library for node, the second was about optimist - the lightweight options parser for node.

This time I'll introduce you to one of my own modules called node-lazy - a lazy lists module for node.

Basically, you create a new lazy object and pump data into it via data events (it's an event emitter). Then you can manipulate this data by chaining various functional programming methods.

Here is a quick example: we create a new lazy object and define a filter that keeps only even integers, then take just 5 elements, map over them, and finally join (think of how threads join) the results into a list:

var Lazy = require('lazy');

var lazy = new Lazy;
lazy
  .filter(function (item) {
    return item % 2 == 0
  })
  .take(5)
  .map(function (item) {
    return item*2;
  })
  .join(function (xs) {
    console.log(xs);
  });

You can return this object from your function, and later, when someone pumps data into it via data events, it will do the computation.

For example, if you do this:

[0,1,2,3,4,5,6,7,8,9,10].forEach(function (x) {
  lazy.emit('data', x);
});
setTimeout(function () { lazy.emit('end') }, 100);

Then the output will be produced by the console.log once 5 elements have reached the bottom of the chain.

The output is: [0, 4, 8, 12, 16].

Here is a real world example from node-iptables (another of my modules):

var Lazy = require('lazy');
var spawn = require('child_process').spawn;
var iptables = spawn('iptables', ['-L', '-n', '-v']);

Lazy(iptables.stdout)
    .lines
    .map(String)
    .skip(2) // skips the two lines that are iptables header
    .map(function (line) {
        // packets, bytes, target, pro, opt, in, out, src, dst, opts
        var fields = line.trim().split(/\s+/, 9);
        return {
            parsed : {
                packets : fields[0],
                bytes : fields[1],
                target : fields[2],
                protocol : fields[3],
                opt : fields[4],
                in : fields[5],
                out : fields[6],
                src : fields[7],
                dst : fields[8]
            },
            raw : line.trim()
        };
    });

This code fragment takes output from iptables -L -n -v and converts it into a data structure for later use.

Here a new Lazy object is created from an existing stream - the iptables.stdout stream. Next, the special lines getter is called, which splits the stream on the \n character and turns it into a stream of lines. Then this stream is mapped onto the String constructor to convert each chunk to a string. Next, the first two lines are skipped via skip(2), and then all the remaining lines are converted into a data structure via map.
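
The fragment above only sets up the pipeline; nothing collects the parsed rules yet. One way to get them all at once is to finish the chain with join, just like in the first example. Here's a sketch, assuming the end of iptables.stdout propagates down the chain, with the mapping function abbreviated into a stand-in called parseLine:

var Lazy = require('lazy');
var spawn = require('child_process').spawn;
var iptables = spawn('iptables', ['-L', '-n', '-v']);

// Abbreviated stand-in for the mapping function shown above.
function parseLine(line) {
    var fields = line.trim().split(/\s+/, 9);
    return { parsed : { packets : fields[0], bytes : fields[1] }, raw : line.trim() };
}

Lazy(iptables.stdout)
    .lines
    .map(String)
    .skip(2)
    .map(parseLine)
    .join(function (rules) {
        // rules is an array of { parsed : ..., raw : ... } objects
        console.dir(rules);
    });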

You can also create all kinds of ranges with node-lazy, including infinite ranges. Check this out, ranges.js:

var Lazy = require('lazy');

Lazy.range('1..20').join(function (xs) {
    console.log(xs);
});

Lazy.range('444..').take(10).join(function (xs) {
    console.log(xs);
});

Lazy.range('2,4..20').take(10).join(function (xs) {
    console.log(xs);
});

When you run it:

$ node ranges.js 
[ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19 ]
[ 2, 4, 6, 8, 10, 12, 14, 16, 18 ]
[ 444, 445, 446, 447, 448, 449, 450, 451, 452, 453 ]

Here are all the possible ranges that node-lazy supports:

Lazy.range('10..')       - infinite range starting from 10
Lazy.range('(10..')      - infinite range starting from 11
Lazy.range(10)           - range from 0 to 9
Lazy.range(-10, 10)      - range from -10 to 9 (-10, -9, ... 0, 1, ... 9)
Lazy.range(-10, 10, 2)   - range from -10 to 8 with a step of 2 (-10, -8, ... 0, 2, 4, 6, 8)
Lazy.range(10, 0, 2)     - reverse range from 10 to 2 with a step of 2 (10, 8, 6, 4, 2)
Lazy.range(10, 0)        - reverse range from 10 to 1
Lazy.range('5..50')      - range from 5 to 49
Lazy.range('50..44')     - range from 50 to 45
Lazy.range('1,1.1..4')   - range from 1 to 4 with an increment of 0.1 (1, 1.1, 1.2, ... 3.9)
Lazy.range('4,3.9..1')   - reverse range from 4 to 1 with a decrement of 0.1
Lazy.range('[1..10]')    - range from 1 to 10 (all inclusive)
Lazy.range('[10..1]')    - range from 10 to 1 (all inclusive)
Lazy.range('[1..10)')    - range from 1 to 9
Lazy.range('[10..1)')    - range from 10 to 2
Lazy.range('(1..10]')    - range from 2 to 10
Lazy.range('(10..1]')    - range from 9 to 1
Lazy.range('(1..10)')    - range from 2 to 9
Lazy.range('[5,10..50]') - range from 5 to 50 with a step of 5 (all inclusive)
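
The purely numeric forms work just like the string forms demonstrated in ranges.js. Here's a small sketch based on the list above:

var Lazy = require('lazy');

Lazy.range(10).join(function (xs) {
    console.log(xs); // expecting [ 0, 1, 2, ... 9 ]
});

Lazy.range(-10, 10, 2).join(function (xs) {
    console.log(xs); // expecting [ -10, -8, ... 6, 8 ]
});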

Install it via npm, as always:

npm install lazy

Awesome sauce!

If you love these articles, subscribe to my blog for more, follow me on Twitter to find out about my adventures, and watch me produce code on GitHub!

Sponsor this blog series!

Doing a node.js company and want your ad to appear in the series? The ad will go out to 14,000 rss subscribers, 7,000 email subscribers, and it will get viewed by thousands of my blog visitors! Email me and we'll set it up!

This article is part of the article series "Node.JS Modules You Should Know About."

Hey everyone! This is the second post in my new node.js modules you should know about article series.

The first post was about dnode - the freestyle rpc library for node.

This time I'll introduce you to node-optimist - the lightweight options parser library. This library is also written by James Halliday (SubStack), my co-founder at Browserling and Testling.

Wonder how lightweight an options parser can be? Check this out:

var argv = require('optimist').argv;

And you're done! All options have been parsed for you and have been put in argv.
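
To see exactly what ends up in argv, here's a tiny sketch that just dumps it (dump.js is a made-up name, and the exact set of extra keys optimist adds may differ slightly):

#!/usr/bin/env node
var argv = require('optimist').argv;

// Print whatever optimist parsed out of the command line.
console.dir(argv);

Running it with a few arguments prints something like:

$ ./dump.js -x 3 --name=foo moo
{ _: [ 'moo' ], x: 3, name: 'foo' }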

Here are various use cases. First off, it supports long arguments:

#!/usr/bin/env node
var argv = require('optimist').argv;

if (argv.rif - 5 * argv.xup > 7.138) {
    console.log('Buy more riffiwobbles');
}
else {
    console.log('Sell the xupptumblers');
}

Now you can run this script with --rif and --xup arguments like this:

$ ./xup.js --rif=55 --xup=9.52
Buy more riffiwobbles

$ ./xup.js --rif 12 --xup 8.1
Sell the xupptumblers

I know you want to buy more riffiwobbles and sell your xupptumblers.

Next, it supports short args:

#!/usr/bin/env node
var argv = require('optimist').argv;
console.log('(%d,%d)', argv.x, argv.y);

You can use -x and -y as arguments:

$ ./short.js -x 10 -y 21
(10,21)

Next, node-optimist supports boolean arguments: short, long, and grouped:

#!/usr/bin/env node
var argv = require('optimist').argv;

if (argv.s) {
    process.stdout.write(argv.fr ? 'Le chat dit: ' : 'The cat says: ');
}
console.log(
    (argv.fr ? 'miaou' : 'meow') + (argv.p ? '.' : '')
);

And now you can invoke the script with various options:

$ ./bool.js -s
The cat says: meow

$ ./bool.js -sp
The cat says: meow.

$ ./bool.js -sp --fr
Le chat dit: miaou.

Next, you can easily get to non-hyphenated options via argv._:

#!/usr/bin/env node
var argv = require('optimist').argv;

console.log('(%d,%d)', argv.x, argv.y);
console.log(argv._);

Here are use cases for non-hyphenated options:

$ ./nonopt.js -x 6.82 -y 3.35 moo
(6.82,3.35)
[ 'moo' ]

$ ./nonopt.js foo -x 0.54 bar -y 1.12 baz
(0.54,1.12)
[ 'foo', 'bar', 'baz' ]

Optimist also comes with .usage() and .demand() functions:

#!/usr/bin/env node
var argv = require('optimist')
    .usage('Usage: $0 -x [num] -y [num]')
    .demand(['x','y'])
    .argv;

console.log(argv.x / argv.y);

Here the arguments x and y are required, and if they are not passed, the usage message is printed automatically:

$ ./divide.js -x 55 -y 11
5

$ node ./divide.js -x 4.91 -z 2.51
Usage: node ./divide.js -x [num] -y [num]

Options:
  -x  [required]
  -y  [required]

Missing required arguments: y

Optimist also supports default arguments via .default():

#!/usr/bin/env node
var argv = require('optimist')
    .default('x', 10)
    .default('y', 10)
    .argv
;
console.log(argv.x + argv.y);

Here x and y default to 10:

$ ./default_singles.js -x 5
15
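
All of these pieces compose, because each method returns the parser before you read .argv off it. For example, here's a sketch that mixes .usage(), .demand() and .default() from the examples above (divide_defaults.js is a made-up name):

#!/usr/bin/env node
var argv = require('optimist')
    .usage('Usage: $0 -x [num] -y [num]')
    .demand(['x'])
    .default('y', 10)
    .argv;

console.log(argv.x / argv.y);

Here -x is required, -y falls back to 10, and running ./divide_defaults.js -x 50 should print 5.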

Alternatively, you can use isaacs' nopt, which can enforce data types on arguments and easily handle lots of arguments. Or you can use nomnom, which noms your args and gives them back to you in a hash.

If you love these articles, subscribe to my blog for more, follow me on Twitter to find out about my adventures, and watch me produce code on GitHub!

Sponsor this blog series!

Doing a node.js company and want your ad to appear in the series? The ad will go out to 14,000 rss subscribers, 7,000 email subscribers, and it will get viewed by thousands of my blog visitors! Email me and we'll set it up!

This article is part of the article series "Node.JS Modules You Should Know About."

Hey everyone! I am starting a new article series called node.js modules you should know about. I have been using node for over 2 years now, and I built the Browserling startup on node, so I know just about everything about it. I have also written about 20 node.js modules myself (see my github).

In this series I will go through a few dozen node.js modules, give examples, and explain where each is useful.

The first module in the series is dnode. Dnode is a freestyle RPC library written by James Halliday (SubStack), my co-founder at Browserling and Testling.

Here is what it looks like. This is server.js:

var dnode = require('dnode');

var server = dnode({
    mul : function (n, m, cb) { cb(n * m) }
});
server.listen(5050);

And here is the client.js:

var dnode = require('dnode');

dnode.connect(5050, function (remote) {
    remote.mul(10, 20, function (n) {
        console.log('10 * 20 = ' + n);
    });
});

Now when you run client.js, you get the output:

$ node client.js
10 * 20 = 200

See what it did? It called the mul function on the server side from the client side and passed it the arguments 10 and 20. They got multiplied on the server, and the result was sent back to the client by calling cb.

It's important to stress that no code was passed along; all of this happened purely through references. You can see the dnode protocol implementation in the dnode-protocol GitHub repo.

Here is a more complex example, where the client calls the server, the server calls back into the client, the client passes its result back to the server, and the server then invokes the client's original callback, which prints the result.

server.js:

var dnode = require('dnode');

var server = dnode(function (client) {
    this.calculate = function (n, m, cb) {
        client.div(n*m, function (res) {
            cb(res+1)
        });
    }
});
server.listen(5050);

client.js:

var dnode = require('dnode');

var client = dnode({
    div : function (n, cb) {
       cb(n/5);
    }
});

client.connect(5050, function (remote) {
    remote.calculate(10, 20, function (n) {
        console.log('the result is ' + n);
    });
});

When you run the client, you'll get the result 41. Here is what happens. First you connect to the dnode server on port 5050. Once you're connected, the dnode client calls the calculate function on the server side and passes it the arguments 10 and 20 plus a callback function that prints the result. When the server receives the arguments 10 and 20, it multiplies them together and calls the client's div function, which divides the result by 5. That result is returned to the server, which adds 1 to it and calls the original callback, printing the result.

We use dnode everywhere at Browserling. Every service is a dnode server, and they are all interconnected. For example, authentication is a dnode server. We can bring it down and update it while the rest of the site stays up. Really awesome.
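
To make that concrete, here's a purely hypothetical sketch of what a tiny authentication service could look like; the checkLogin function and the in-memory user table are made up for illustration, and only the dnode calls are the real API from the examples above:

var dnode = require('dnode');

// Made-up in-memory "user database", for illustration only.
var users = { alice : 'wonderland' };

var auth = dnode({
    checkLogin : function (name, pass, cb) {
        cb(users[name] === pass);
    }
});
auth.listen(5051);

Any other service can then dnode.connect(5051, ...) and call remote.checkLogin(name, pass, cb) exactly like the mul example earlier, and the auth service can be restarted independently of everything else.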

You can install dnode through npm:

npm install dnode

And since dnode has a well-defined protocol, you can implement it in any language! Here are dnode implementations in Perl, Ruby, PHP, and Java.

If you love these articles, subscribe to my blog for more, follow me on Twitter to find out about my adventures, and watch me produce code on GitHub!

Sponsor this blog series!

Doing a node.js company and want your ad to appear in the series? The ad will go out to 14,000 rss subscribers, 7,000 email subscribers, and it will get viewed by thousands of my blog visitors! Email me and we'll set it up!