I enjoyed the challenge and stimulation of rethinking everything I know about the server, but I still found myself hesitant to push these new ideas too far or too fast. These servers are fresh from the lab and made for experimenting, not building an application for Grandma to check the interest on her CDs. The ideal project for a corporation might be a temporary website for a one- or two-day event that would come and go in a flash. For now, enjoy creating something new and fun with them, not betting your business.
The most powerful idea is that Node.js is light, whereas alternatives such as Java are heavier. The secret of the tool's success seems to lie in one factoid often repeated by Node.js lovers: a Java server uses 2MB of RAM just to create a thread. Because the standard Java Servlet container creates one thread for each request, a fairly hefty server with, say, 8GB of free RAM can handle only about 4,000 simultaneous users. In practice the threads often use more memory than that, which pushes the ceiling even lower.
Threads were supposed to be lightweight ways for a processor to juggle the workload, and they were certainly successful back when people were satisfied with handling several thousand simultaneous users. But when people started counting up the costs of the overhead for bigger and bigger websites, some started wondering if there was a better way.
Node.js is one good solution. It uses only one thread for your server, and everything runs within it. When requests come flying in, Node.js takes them one at a time and hands them to the single function that was specified when the server was created. If you thought a Java Server Page, a Java Servlet, or a PHP file was a lightweight way of building a website, you'll be impressed with the efficiency of this approach.
What? If you're going to point out that a JSP or PHP file may be as simple as the words "hello world," stop right there. You have to think beneath the surface and remember everything that the Java Servlet container or PHP server does for you. You may just write "hello world" in a JSP, but Java will burn 2MB of RAM creating a thread that supports the code that will eventually output the words "hello world." The JSP might seem simple, but it's not.
Node.js does very little except grab the incoming request, call the handler function you supplied, and marshal the results out the door. This single-mindedness lets it juggle all of the requests hammering at the port and dispatch them quickly.
I've seen standard-issue desktop machines easily handle thousands of requests more or less simultaneously. The data goes in and out like lightning because everything is handled in RAM and probably in the cache. Simple websites are surprisingly efficient.
But it's important to recognize that some of this lightning speed comes from leaving out other features. Running everything in one thread means everything can back up if that thread gets overloaded. All of the work that Java spends on putting clean, fresh sheets on the bed really pays off if one thread takes a long time to finish.
To test this, I created a simple server that takes a value n and adds up all of the numbers between 1 and n. This, by the way, is a purely CPU-bound operation that should need only a couple of registers; it can't get hung up waiting for RAM or the file system. The server was just as fast as before. I had to feed my underpowered desktop (1.83GHz Intel Core Duo) numbers like n=90,000,000, a 9 followed by seven zeros, before it seemed to pause at all. The answer had 16 digits.
When I fed fat numbers to the server, I found that all of the other requests would get in line behind it. When the workload is short, Node.js seems to be multitasking because it gets done with everything so quickly. But if you find an item that weighs down the server, you can lock up everything in a queue behind it.
Fear not. If this happens, Node.js lovers will blame you, not the machine. Your job as a programmer is to anticipate any delays, such as a request for a distant Web service. Then you break your code into two functions, just as AJAX programmers often do on the client. When the data is returned, Node.js will invoke the callback function. In the meantime, it will handle other requests.
In the right hands connected to a right-thinking mind, the results can be staggeringly efficient. When the programmer spends a few minutes and separates the work done before and after a delay, there's no need for the machine to tie up RAM to hold the state in the thread just so it will be ready when the data finally shows up from the database or the distant server or even the file system. It was undeniably easier for the programmer to let the computer keep all of the state, but it's much more efficient this way. Just as businesses try desperately to avoid tying up capital in inventory, a programmer's job is to think like a factory boss, treat RAM as capital, and avoid consuming any of it.
I found it pretty easy to build Web pages using the technique. Anyone who is new to AJAX will discover it's much more convenient to let Jaxer handle all of the background work of bundling and unbundling the data. It's all mostly automatic and even simpler to use than some of the AJAX libraries such as jQuery.
It's not clear how much support Jaxer is enjoying these days. The server used to be bundled with Aptana's other offerings, but now it's left alone in a corner. My guess is that many people don't need that much help handling AJAX calls now that libraries like jQuery simplify the process.
The ideal job for Jaxer will be one where most of the work is done on the client but some crucial part must run on the server. It's very easy to make code run on the server, but it's not so easy to write complex server-side code. There are plenty of jobs like this, and the people coding these jobs are the ones that Aptana is targeting when it says you can write an entire Web application in one file.
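Based on Jaxer's documented runat attribute, a single-file application might look something like this. Everything here is a hypothetical sketch, not an Aptana example: "server" and "server-proxy" scripts run on the server, and "server-proxy" functions become callable from the browser as if they were local.

```html
<!-- A hypothetical single-file Jaxer page. Function and id names are
     illustrative; only the runat attribute values come from Jaxer. -->
<html>
  <head>
    <script runat="server-proxy">
      // Runs on the server, but callable from client code as lookupRate().
      function lookupRate(account) {
        return 0.05; // in a real application, a database query would go here
      }
    </script>
    <script runat="client">
      function showRate() {
        document.getElementById('rate').innerHTML = lookupRate('grandma');
      }
    </script>
  </head>
  <body onload="showRate()">Rate: <span id="rate"></span></body>
</html>
```

The appeal is that the crucial server-side part (the rate lookup) lives in the same file as the page that displays it, with Jaxer doing the AJAX plumbing.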