Node.js - A giant step backwards? (fennb.com)
105 points by jemeshsu on Jan 25, 2012 | 97 comments



Allow me to be a little grumpy:

I hate tabloidy headlines with question marks. As in "Queen Elizabeth: Is She a Transvestite?", "The Moon: Is It Made of Cheese?" or "Linkbait: Will It Ever End?".



Well sourced. Thank you.


I forget where I first heard this (the idea isn't mine) but I believe that in almost every case, the answer is "No." Then, the article usually goes on to explain why not.


Perfect example of linkbait: a six-month-old article.


The big reason callback-based systems are hard (beyond the fact that built-in flow control constructs need to be re-implemented) is that functions are no longer composable. That is, if I write a bunch of code that doesn't need to do any I/O, I'll just return a value to the caller. If at any point in the future the spec changes and any function in this call stack needs to do any I/O, all the code that ends up calling this function needs to be refactored to use callbacks.

There really should be language support for this sort of thing (like coroutines) so this kind of cascading change doesn't need to happen.
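
To make the point concrete, here is a hedged sketch (the function and API names are hypothetical, loosely borrowed from the thread's examples, not from the article itself): a pure helper that simply returns a value, and what it and every caller must become once the helper needs to hit the database.

    // Before: a pure function; callers just use the return value.
    function getDisplayName(user) {
        return user.firstName + " " + user.lastName;
    }
    var title = "Hello, " + getDisplayName(currentUser);

    // After: the helper now needs async I/O, so it has to take a callback...
    function getDisplayName(userId, callback) {
        asynchronousDB.query("SELECT * from users WHERE id = $id", {id: userId}, function (err, user) {
            if (err) return callback(err);
            callback(null, user.firstName + " " + user.lastName);
        });
    }
    // ...and every caller (and every caller of those callers) has to be rewritten too.
    getDisplayName(currentUserId, function (err, name) {
        if (err) return handleError(err);
        var title = "Hello, " + name;
        // anything that used `title` now lives inside this callback
    });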


> The big reason callback-based systems are hard (beyond the fact that built-in flow control constructs need to be re-implemented) is that functions are no longer composable

Absolutely, I've had it happen in "big" (client-side) JS projects.

The only way I've found so far to handle this is to make anything that might ever have any reason to become asynchronous (so anything but helper functions) take callbacks or return deferred objects. Always.

But then another issue arises: for various reasons, callback-based code which works synchronously may fail asynchronously, and the other way around. And then it starts getting real fun, as you still have to go through all your (supposedly) carefully constructed callback-based code to find out what reason it would have to fail (alternatively, you create both sync and async tests for all pieces of callback-based code).
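
A hedged sketch of the sync-vs-async hazard being described (the cache object and query API here are made-up placeholders): the same callback-taking function may invoke its callback before or after it returns, and callers that silently assume one or the other will break.

    function getUser(id, callback) {
        if (cache[id]) {
            // Callback fires *synchronously*, before getUser returns...
            callback(null, cache[id]);
        } else {
            // ...or *asynchronously*, on a later turn of the event loop.
            asynchronousDB.query("SELECT * from users WHERE id = " + id, callback);
        }
    }

    var finished = false;
    getUser(42, function (err, user) { finished = true; });
    // `finished` may or may not be true here depending on the cache -- exactly
    // the kind of difference you only catch by testing both the sync and the
    // async path, as suggested above.
    console.log(finished);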


There are ways to compose functions, like the deferrable pattern. They just all kind of suck and are poor replacements for having a stack.

I'm all for using coroutines to solve this problem. That's the approach taken by my Celluloid::IO library:

https://github.com/tarcieri/celluloid-io

Unfortunately Ryan Dahl is adamantly opposed to coroutines so that's not going to happen in Node any time soon.


I like Kris Zyp's promises and use them a lot:

https://github.com/kriszyp/promised-io


Deferreds make your code nicer, but they still don't magically make your code composable.

What would make Node more attractive is if it supported copy-on-write multithreading and gave me a way to cheat and block on asynchronous I/O (like a wait(myFunctionThatTakesACallbackOrDeferred) function)


There are many ways to compose deferrables, primarily by grouping several of them together into something which is also a deferrable. See the em-scenario gem for examples:

https://github.com/athoune/em-scenario

Note: I still think this approach sucks.

V8 provides a really awesome shared-nothing multithreading scheme via web workers. It's just that nobody uses them.


Oh right that's definitely true and is much more elegant. I was talking about when a function written in synchronous style (in a long stack of synchronous calls) needs to call something asynchronous.


I wouldn't say the problem with Deferreds is composability - the big problem, as you already mentioned, is fragility towards future changes.


Node supports no language extensions that V8 doesn't. Ryan Dahl says:

"Node does not modify the JavaScript runtime. This is for the ECMA committee and V8 to decide. If they add coroutines (which they won't) then we will include it. If they add generators we will include them."

See Tim's thread, My Humble Coroutine Proposal (https://groups.google.com/forum/#!topic/nodejs/HJOyNMKLgB8). Warning: long.

So, you don't like javascript? Don't use it.

But you won't beat the control over program state offered by javascript -- not until computers understand their programmers well enough to reason about program state. We need better computers, better computer science, and better programmers -- that's all. Then we can replace NodeJS with something better.

Until then, NodeJS is almost certainly the most tasteful solution to the most common problems. I hope its replacement meets so high a standard.


Coroutines on a server...like OpenResty [1] (a package containing Nginx w/Lua integration)?

The example "asynchronous get" becomes something like:

    local myObject = query{ id=3244 };
The query function can be written to handle a cache lookup, a database query that gets stored in the cache, and any other logging you want, because Lua has coroutines. All I/O that gets sent through the "ngx" query object (which can connect to local or remote ports) yields control to the main loop.

"query" is an example of a function you could create; its implementation (with a cache lookup) could look something like (yes, I use CouchDB...):

    function query(t)
      -- ngx.location.capture returns a response table (status, header, body),
      -- so check the status/body rather than the length of the table itself.
      local result = ngx.location.capture("/cacheserver/id:" .. t.id)
      if result.status ~= 200 or #result.body == 0 then
        -- cache miss: fall back to the database
        result = ngx.location.capture("/couchdb/usertable/" .. t.id)
      end
      return result
    end
I've heard reports of 50k+ connections/second on a VPS running Nginx, LuaJit, and the LuaNginxModule, and on my low-end VPS it easily handles 2000+ connections per second (with CouchDB queries) with no more than 250ms latency. Actually, that was as many connections as I could send at it, so it may be able to handle a lot more.

[1] http://openresty.org/


I'm guessing monads will be a potential solution implemented pretty soon. I'd be interested in seeing how that works out.


There is already a monad-based solution - it's Deferreds / Promises.

However, monads don't solve this problem - they cause it, since their primary concern is correctness and not converting between monadic and non-monadic code. If you have a pure function and need to convert it to a monadic action there will be lots of collateral damage as functions that interacted with the old function have to be converted to monadic style.


Yeah, just after I wrote that I realized that Deferreds/Promises are monads.

You don't have to convert them, but you do need a way to write code using those pure functions "inside" the monad which I don't think is easy with this model.

You should be able to just chain your actions together, with a wrapper function to turn pure functions into actions. I don't see why you would need to convert the actual pure functions to monadic style.

I guess what I had in my head when I made the comment was the convenience functions or the syntactic sugar around a monadic solution. Seems like the node guys are playing with things like this but all the solutions I've seen seem so ugly.


Pure code can be used in the monad just fine (i.e. you can return values up to the monad and lift functions):

    //turn a function into a monadic action
    function lift(f){ return function(/**/){
        var d = new Deferred(); // I'm using the Dojo API
        d.resolve( f.apply(this, arguments) ); // apply, so the arguments are spread
        return d.promise;
    };}

    var f = function(x){ return x+1; };

    lift(f)(0)
    .then(function(y){ ... })
The problem is the opposite direction: pure code using monadic code and pure code turning into monadic code

If you have something like

    var x = f1( f2( f3() ) )
And f2 becomes a monadic action you have to rewrite this bit as

    var xPromise = f2( f3() )
    .then(function(f2result){
        return f1(f2result);
    });


Ah, very nice, it is easy. The chaining is still fairly clunky, though maybe comparing it to Haskell's do notation syntactic sugar is unfair.

(isn't the opposite direction the feature of doing things this way? That you can't call monadic code from pure code is a good thing)


Callback-centric code can be composable if you adhere to the continuation passing style (CPS) discipline. While it can be difficult to write CPS code directly, it's a well-trodden path in computer science.


And the people who trod that path all came back and told us that it's a worse way for humans to write and reason about code.

Not that it can't be used under the hood of course.


In this sense, I think the approach of extending Javascript with async features and compiling down to CPS (ala http://tamejs.org/) has a lot of promise.


Yeah that's true, but using CPS for everything means you can't really use normal flow control anywhere which IMO makes it inappropriate to use for day-to-day engineering.


I have been using node.js for my day-to-day engineering for the past year. It certainly takes a couple of weeks to get a good feel for solving problems without being able to use normal flow control. But once you get it, it becomes easy. So I wouldn't call it inappropriate, just different. It's much like the feeling of learning your first functional programming language. It's not wrong, just different until you adjust the way you reason about your programs.


CPS is also generally used as an IR in compilers, not something you write by hand


Yes. In Scheme, call/cc is generally not used directly. Instead, you build higher-level abstractions, for example full co-routines or perhaps more limited "generators" like Python has. Co-routines have been around for more than 30 years. It's quite sad to see the Node.js developers ignore decades of language research. Implementing concurrency using callback functions is definitely a "worse is better" approach. Maybe it will win because of that.

Edit: I guess a hopeful idea is that there should be no reason something like call/cc could not be added to Node.js. In that case, the extensive library of non-blocking functionality will be very handy, since you could build a sane continuation-based interface on top of it and escape from callback purgatory.


Node has coroutine support via the fibers package: https://github.com/laverdet/node-fibers. For some benchmarks, check out my Common Node project's README: https://github.com/olegp/common-node
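
For readers who haven't seen fibers, here is a hedged sketch of the usual wrapping pattern, assuming the node-fibers API (Fiber, Fiber.current, Fiber.yield, run, throwInto) and a hypothetical callback-style db.getBlogPostById: suspend the current fiber until a callback fires, so the calling code reads synchronously without blocking the event loop.

    var Fiber = require('fibers');

    // Run a callback-style operation and suspend the current fiber until it
    // completes. Assumes the callback fires asynchronously (on a later tick).
    function wait(operation) {
        var fiber = Fiber.current;
        operation(function (err, result) {
            if (err) fiber.throwInto(err);
            else fiber.run(result);
        });
        return Fiber.yield();
    }

    Fiber(function () {
        // Reads top-to-bottom like synchronous code.
        var post = wait(function (cb) { db.getBlogPostById(3244, cb); });
        console.log(templating.render(post));
    }).run();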


In my experience, node.js is a terrible choice for most web applications. If you really need real-time, sure, go for it. But if you're interested in node as an alternative to rails/sinatra/django/flask, then stay away from it. The cool stuff you get for free like coffeescript/jade/stylus can also work with the ruby and python frameworks.

Node sucks for general web apps because you have to program everything asynchronously. This is a major step backwards, and quite frankly it feels to me like trying to program in assembly. It's not expressive at all. You have to write your program in some pseudo code, then translate that to async code. And what for? What's the advantage you'll get? scalability? Who says you will need it? This is exactly where you should remember that premature optimization is the root of all evil.


I can second this. When I first found out about node, I wanted to write all my web apps in it. I quickly found out that made no sense at all.

It might be a bit more performant than Django (my go-to framework), but the amount of time it took me to produce the same work made it impractical.

The one web app situation I would use it for is when I have to make a request to a couple of databases/caches/etc, and each response isn't dependent on the others. That would make node far more performant than a sync framework. But this situation rarely seems to arise.


> Node sucks for general web apps because you have to program everything asynchronously.

To be fair, the exact problem is not async itself, but forcing CPS (continuation-passing style) for serial routines. For example, gevent and eventlet use greenlets (coroutines for Python) to avoid unnecessary callbacks in serial routines.


I'm curious -- how would the author's example look using greenlets?


This is really old - discussion from 6 months ago: http://news.ycombinator.com/item?id=2848239

Basically, async I/O gives you more options than "block the whole world while you go read this stuff", and that means that old idioms aren't effective.

You gain more control: you can choose when to block, when to limit concurrency and when to just launch a bunch of tasks at the same time. At the same time, you need to adopt a few new patterns, since you can't/don't feel right blocking execution every time you use an external data source. It's definitely a tradeoff and not a magic bullet.

For my longer take on this, see http://book.mixu.net/ch7.html


You don't get to choose when to block. Blocking becomes an error which hangs the event loop. The only option for blocking calls is to use a thread pool that sends events back to the event loop when a thread finishes running.


You're right, I should've said "emulate blocking by checking, whenever a task finishes, that all of the tasks we queued have completed before proceeding, while allowing the event loop to run" instead of "blocking". It's not really blocking the event loop, only controlling the flow of a particular path of execution. Only a few native APIs provide synchronous versions that will block the entire process until they complete - like the filesystem API's fs.readFileSync.
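
A hedged sketch of what that emulation usually looks like in practice (the helper and task names are illustrative, not a library API): queue the tasks, count completions, and only proceed once everything queued has finished, while the event loop keeps running in between.

    var fs = require('fs');

    function runAll(tasks, done) {
        var remaining = tasks.length, results = [], failed = false;
        if (remaining === 0) return done(null, results);
        tasks.forEach(function (task, i) {
            task(function (err, result) {
                if (failed) return;
                if (err) { failed = true; return done(err); }
                results[i] = result;
                if (--remaining === 0) done(null, results); // "unblock" this path here
            });
        });
    }

    runAll([
        function (cb) { fs.readFile('a.txt', 'utf8', cb); },
        function (cb) { fs.readFile('b.txt', 'utf8', cb); }
    ], function (err, contents) {
        // Reached only after both reads complete; nothing was actually blocked.
    });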


Yeah, as ridiculous as it sounds, this sort of post had a definite time limit. It's not really relevant at this point, aside from historical purposes.


While I agree that event programming is very different, some of the issues the author brings up can be dealt with by architecting a program for an event-based system. I understand that this is the author's gripe: that it is sometimes hard for someone coming from a non-event control flow background to adapt at first. But items like the loop example mix half control-flow and half event-based programming. What should be done in that situation is that it should not return a list; it should return a promise that gets called on completion of the list. Or, a more elegant solution would be to notify listeners when a new item of the list is parsed, so that they can observe it and see if it is an item they are interested in. I understand the author's frustration, but it appears to me that some more articles on best practices would help bring clarity on how to deal with these kinds of patterns.
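
A hedged sketch of the "notify listeners per item" idea using Node's EventEmitter (the chunk-parsing function and event names are made up for illustration):

    var EventEmitter = require('events').EventEmitter;

    function parseItems(chunks) {
        var emitter = new EventEmitter();
        chunks.forEach(function (chunk) {
            // Pretend each chunk is parsed asynchronously.
            process.nextTick(function () {
                emitter.emit('item', parseChunk(chunk)); // one event per parsed item
            });
        });
        process.nextTick(function () { emitter.emit('end'); });
        return emitter; // callers subscribe instead of waiting for a complete list
    }

    parseItems(chunks)
        .on('item', function (item) { /* observe each item as it is parsed */ })
        .on('end', function () { /* all items have been seen */ });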


I hope this isn't too pedantic, but I wish the author would stay away from calling Node "concurrent". The whole point of Node is that it's asynchronous but not concurrent: there is a single thread of execution for your whole program, which is what lets you ignore locking, etc.

In fact, when you program for Node it's really important to keep this in mind, since (contrary to another statement from the article) not all libraries are asynchronous. If you select a synchronous db driver or write a long-running loop, it will block the rest of your program.

In general, though, I thought it was a good piece. I'm sure many heads have exploded on first introduction to node (and JS in general).


You're confusing concurrency and parallelism. Concurrency describes having several tasks or operations that contend on resources or events. They may or may not execute in parallel, and certainly don't in Node. In contrast, Netty provides a thread pool for executing events and thus provides concurrency and parallelism.


As far as I know, by definition (coming from someone with a predominantly Java background) concurrency requires multi-core processors, or, for example, a distributed application with at least 2 nodes that communicate with the same central server / "hub" (which would still require a multi-core processor, as far as I know, on the server). The central tenet mentioned with concurrency is often that of "race conditions", where it cannot be predicted which thread or node will access a resource first. If a task isn't executed in parallel, such that it has to wait for the other to finish before it starts / proceeds, I don't see how it could cause a race condition.

Admittedly, though, I don't have deep knowledge of how high-level code translates to actual CPU instructions, so it could be possible, or even likely, that if the processor is switching between tasks in such a way that each line of code that runs comes from a different method, then concurrency issues would occur even on a single-core processor. A language like Java has robust and mature facilities to deal with this, so I would be wary of using something like Node.js if this wasn't well documented. Concurrency issues are nasty and I have seen firsthand the mayhem they can cause in legacy applications I've worked on.


Concurrency doesn't require multi-core. Parallelism requires multi-core.


I certainly don't like to go around spouting nonsense, so I spent some time looking for formal definitions of concurrency. I wasn't able to find any support for the description you provided, though. Do you have a source?

FWIW, Wikipedia seems to believe that "concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other".

Regardless, sorry if sloppy (or poorly defined) terminology obscured my point.


If you quote the Wikipedia definition with a bit more context you'll see it matches mine:

"In computer science, concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. The computations may be executing on multiple cores in the same chip, preemptively time-shared threads on the same processor, or executed on physically separated processors."


Node.js is concurrent - you can have multiple paths of execution at the same time. It's just that Node.js concurrency has non-preemptive context switches at well-defined points (function boundaries), thus no need for locks, whereas preemptive concurrency can context-switch at any point and thus needs locks.
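
A small hedged illustration of that point (the query call is the article's placeholder API): because each callback runs to completion before any other JavaScript is scheduled, shared state can be read and updated without a lock.

    var counter = 0;

    function addFromDb(id, done) {
        asynchronousDB.query("SELECT delta FROM counters WHERE id = " + id, function (err, row) {
            // No other JavaScript can run between the read and the write below:
            // context switches happen only at well-defined points (the next async
            // call), so this read-modify-write needs no lock.
            counter = counter + row.delta;
            done();
        });
    }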


There are multiple paths of execution in your code, but they are never running concurrently. Only the underlying I/O is really executing concurrently with your code.


Multiple paths of execution is concurrency. I guess what you intended to mean was parallelism, which is different from concurrency.

A multi-threaded program running on a single core system has only one path executed at any given time, but we still call it a concurrent program.


Concurrency is a superset and includes parallelism. What we both really mean is that it uses at most one core at a time to execute your code.

http://en.wikipedia.org/wiki/Concurrency_(computer_science)


_That_ I did not know. Thanks!


Node is concurrent. libuv spawns threads and uses Unix and Windows APIs to read files, and passes messages back to the main JS thread when they're done. So while your JavaScript code only runs in one thread, threads are used under the hood, making it concurrent.


> If you select a synchronous db driver

There is no such thing for node, unless you use a compiled addon. Node has no blocking networking facilities.


And even compiled add-ons use the thread pool to be async.


This guy doesn't know much about javascript, I am guessing. I made some gists that take his code and add minimal changes that fix the problems he complains about:

"Two different code paths, can't do DRY" really? https://gist.github.com/1678395

"Oh noo, I can't return the results because they are async". That's what callbacks are for. You know what you CAN do? Do I/O in parallel that's what! Node makes it easy. https://gist.github.com/1678415

Anyway I hope this illustrates the point. The guy says it exactly right in one place: "Once you get your head around thinking in async terms, node.js starts to actually make a lot of sense." And therefore it is not a giant step backwards.

There are more elegant ways to write this (see http://qbix.com/plugins/Q/js/Q.js) but these are just minimal changes to his own code.


I tend to agree with the post, but I find the one-language-to-rule-them-all thing too compelling to fret too much over asynchronicity. Although it'd probably lead to lots of synchronous code, it would be nice for it to be easier in node.js to be synchronous sometimes and asynchronous sometimes.

I would refactor the code to something like (still not as simple as synchronous):

    asynchronousCache.get("id:3244", function(err, myThing) {
      var useResult = function(err, _myThing){
          // We now have a thing (from cache or DB), do something with result
          // ...
      };

      if (myThing)
         useResult(null, myThing);
      else
         asynchronousDB.query("SELECT * from something WHERE id = 3244", useResult);
    });


The problem is that the author simply didn't notice the refactoring and abstraction opportunities available. (One question to ask yourself: "How do I test this?" If you can't answer that question, the code is wrong.)

We'll start with the synchronous example:

    myThing = synchronousCache.get("id:3244");
    if (myThing == null) {
      myThing = synchronousDB.query("SELECT * from something WHERE id = 3244");
    }
This is verbose and tedious. We should really make the API look like:

    myThing = database.lookup({'id':3244}, {'cache':cache_object});
Let's apply this idea to his asynchronous example. We want the code to look like:

    database.lookup({'id':3244}, {'cache':cache_object}, function(myThing) {
        // whatever
    });
So instead of writing this:

    asynchronousCache.get("id:3244", function(err, myThing) {
      if (myThing == null) {
        asynchronousDB.query("SELECT * from something WHERE id = 3244", function(err, myThing) {
          // We now have a thing from DB, do something with result
          // ...
        });
    
      } else {
        // We have a thing from cache, do something with result
        // ...
      }
    });
We need to refactor this. Remember, node.js is a continuation-passing-style language. So let's set a convention and say that every function takes two continuations (success and error).

Then, to compose two functions of one argument:

   function f(x, result, error)
   function g(x, result, error)
To:

   h = f o g
You write:

   function compose(f, g){
       return function(x, result, error){
           g(x, function(x_){ f(x_, result, error) }, error);
       }
   }
(Data flows right-to-left over composition, so "do x, then do y" is written: "do y" o "do x".)

Now we can cleanly write a complex program from simple parts. We'll start by creating a result type:

    result = { 'id': null, 'value': null, 'not_found': null }
Then, we'll implement cache functions that take keys (as results of this type) and return values (as results of this type). Looking up an entry in cache looks like:

    cache.lookup = function(key, result, error){
        new_key = key.copy();
        cache.raw_cache.lookup(key.id, function(value){
            new_key.value = value; // matches the 'value' field of the result type
            new_key.not_found = false;
            result(new_key)
        },
        function(error_type, error_msg){
            if(error_type == ENOENT){
                new_key.not_found = true;
                result(new_key)
            }
            else {
                error(error_type, error_msg);
            }
        });
    };
Looking up an entry in the database looks about the same. The key feature is that the "return value" and the "input" are of the same type. That makes composing, in the case of "try various abstract storage layer lookups in a fixed order", very easy. (Yes, the example is contrived.)

    dbapi.lookup = function(key, result, error){ ... };
 
Now we can very easily implement the logic, "look up a value in the cache, if it's not there, look it up in the database":

    cached_lookup = compose(dbapi.lookup, cache.lookup);
    cached_lookup(1234, do_next_step, handle_error);
You can, of course, generalize compose to something like:

    my_program = do([cache.lookup, dbapi.lookup, print_result]);
Writing clean and maintainable code in node.js is the same as writing it in any other language. You need to design your program correctly, and rewrite the parts that aren't designed correctly when you realize that your code is becoming messy.

Continuation-passing style is pretty weird, but you do get some benefits over the alternatives. Writing a program with coroutines involves deferring to the scheduler coroutine every so often, littering your code with meaningless lines like "yield();". Using "real" threads is even worse; your code looks like single-threaded code, but different parts of your program are running concurrently. (Did you share any non-thread-safe data structures, like Java's date formatter? Hope not, because you won't know you did until the production code dies at 3am.) Continuation-passing style lets you "pretend" that you are executing multiple threads concurrently, but the structure of the code ensures that only one codepath is running at a time. This means that libraries that don't do IO don't have to be thread safe, since only one "thread" runs at a time.

All concurrency models involve trade-offs over other concurrency models. But when comparing them, make sure you're comparing the actual trade-offs, not your programming ability with each model.


I think you've actually demonstrated how something that would be really simple in python, gets horribly complicated in javascript. Or maybe it's just me...


If you use a continuation passing style in Python, then the code looks about the same. Most Python programmers use threads (and let the GIL give them a bit more thread safety than C++ and Java programmers get) or Twisted (with Deferreds).

I think you'll write better JavaScript if you know Python because Python encourages you to use named functions instead of lambdas. JavaScript fanbois get very excited about anonymous functions and overuse them; Python doesn't let you use anonymous functions for anything useful, so you tend to name things. (object.method is also nice syntax for working with callbacks.)
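
For instance (a hedged sketch with hypothetical handler names, treating the render step as asynchronous for illustration), naming the callbacks instead of inlining them keeps the nesting shallow and gives stack traces something useful to report:

    // Anonymous-function style: nesting grows with every step (error handling omitted).
    asynchronousDB.query("SELECT * from posts WHERE id = 3244", function (err, post) {
        templating.render(post, function (err, html) {
            response.end(html);
        });
    });

    // Named-function style: each step is a flat, individually testable function.
    function onPost(err, post) {
        if (err) return onError(err);
        templating.render(post, onHtml);
    }
    function onHtml(err, html) {
        if (err) return onError(err);
        response.end(html);
    }
    asynchronousDB.query("SELECT * from posts WHERE id = 3244", onPost);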

Anyway, Python and Node feel about the same to me, except for the fact that Python has nicer syntax.


No, it's definitely not just you. This seems way too difficult for trying to accomplish something so simple.


Great takeaway, jrockway. For his other example, I encourage people to take a look at caolan's excellent async library:

  async.waterfall([
    function(callback) {
      async.map(ids, db.getById, callback)
    },
    function(posts, callback) {
      callback(null, posts.map(templating.render))
    }
  ], function(err, results) {
    console.log(results);
  });
EDIT: Fixed example.


Personally I've gotten more use out of the Futures library, though both are great. https://github.com/coolaj86/futures It contains a form of 'waterfall' known as merely a sequence. I use a lot of sequences and promises, it makes me not go insane from NodeJS programming. In the end I still think it was a mistake for Node to not support any sync features because there are times it's nice to have. TameJS doesn't solve the problem completely either (and I hate the compile step).


Impressive example. However, as hinted at in other comments related to composing, this style of composition is really quite hard to hold in your head all at once.

It's certainly not a path I would relish having to follow, and I would not expect everyone to be able to program at this level.

I'd certainly not go so far as to criticize their programming ability for not being able to intuitively do this.


Dude... are you converting from Perl/Catalyst? :)


Nope! But I play with programming languages for fun.


Transforming functional or imperative code into continuation-passing style isn't that big a deal. If javascript weren't such a pain in the ass just to parse, there would probably be tools to do that. Maybe coffeescript will do it, but this is why macro-extensible languages are a big win—I'd have the right hooks to easily do it myself rather than waiting for the implementors to officially update the language (or get elbow deep in their internals and hope they accept a huge patch).


There's already a branch of CoffeeScript, called Iced CoffeeScript (http://maxtaco.github.com/coffee-script/), that does this. The example in the post would look like:

  for blogPostId in recentBlogPostIds
    await asynchronousDB.getBlogPostById blogPostId, defer(err, post)
    templating.render post
You still can't simply return the result — you'd have to use a continuation — but it does make it simple enough to use CPS in general.


To make things easier, you can use node-fibers (https://github.com/laverdet/node-fibers) to structure your asynchronous code with coroutines or, if you are feeling less adventurous, async (https://github.com/caolan/async) is an excellent helper library for common asynchronous code patterns and it works on the client-side as well.


This callback programming is the reason I quit node.js. It is easy to get something up and running, but then it feels like I never get out of the chaos I created.

And this on top of the whole mess that JavaScript is? I never liked it to begin with, but there is no real alternative until Dart is ready. Every time something comes out for JavaScript, it adds another layer of abstraction and chaos, in my opinion. jQuery, for example: really impressive to begin with, but when you see what a horrible mess you can create with it...

There is a reason why big companies never adopt these things; I can't imagine how it would be to take over a node.js app from someone else.

Now you can argue that this takes practice. Crockford may write JS from heaven, but I don't want to invest my time in this language. These inconsistencies are not fun to deal with, and when Dart is here, companies will drop it very fast.

I am now stuck with Scala, which is the complete opposite: it is complicated to get into, but once you get it, you have a gigantic toolbox to solve every problem the way you want. For web programming I recommend Lift, but if you want to get in fast and are a fan of async, try Play 2.0. Node.js made async popular; it should get credit for that.


> There is a reason why big companies never adopt these things; I can't imagine how it would be to take over a node.js app from someone else.

That's incredibly inaccurate. Big companies adopt a lot of crazy things, craziness isn't much of a deciding factor. Node.js is used by plenty of large companies and despite your tastes, Javascript in general is ridiculously popular in companies of all sizes.


References, please; client-side doesn't count. Google forbids it and I don't know of any big company who has SSJS in production. It's definitely not ridiculously popular.


Apparently Walmart also uses it for mobile: http://venturebeat.com/2012/01/24/why-walmart-is-using-node-...


LinkedIn uses it for their mobile platform. I know some pretty big companies are looking into it. I know there are sizable companies out there using it in some capacity: https://github.com/joyent/node/wiki/Projects,-Applications,-...


Use recursion. Fixed:

  asynchronousCache.get("id:3244", function doThing(err, myThing) {
    if (myThing === null) {
      asynchronousDB.query("SELECT * from something WHERE id = 3244", function(err, myThing) {
        // We now have a thing from DB, do something with result
        doThing(err, myThing);
      });

      return;
    } else if (err !== null) {
      // Handle error.
      return;
    }

    // We have a thing.
  });


Were you implying that your solution was simpler?


Tame JS (tamejs.org) makes asynchronous code in node.js very very easy.


Thank you!


I thought that this was going to be a node.js bashing.

What a let down.


The problem is, Javascript continuation syntax is ugly and verbose. All the nested indentation fails to map to our human sensibilities about what the code is actually designed to do.

Someone needs to fix this


I am not convinced by the blog post example. If ordering and null entries really matter, then you must add appropriate handling for them. Ordering can be tackled by adding the blogPostId to each entry and sorting the resulting collection by this key (assuming that you don't pick up a million+ posts).
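
A hedged sketch of that ordering fix (the API names follow the article's examples as quoted elsewhere in the thread): tag each result with its blogPostId and sort once everything has arrived.

    function getRenderedPosts(recentBlogPostIds, done) {
        var results = [], remaining = recentBlogPostIds.length;
        recentBlogPostIds.forEach(function (blogPostId) {
            asynchronousDB.getBlogPostById(blogPostId, function (err, post) {
                results.push({ id: blogPostId, post: post }); // keep the key with the entry
                if (--remaining === 0) {
                    results.sort(function (a, b) { return a.id - b.id; });
                    done(results.map(function (r) { return templating.render(r.post); }));
                }
            });
        });
    }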

I advocate polyglot programming, and using node.js for tasks other than what it's designed for (server-side async programming) might produce unfavorable results.

iPods are not lousy because one can't text with them.


My take on this post is that the author was calling Node a giant leap backwards because he believed all code should have the look and feel of Python.

But now, he's not so sure.

Did I miss anything?

ETA: I understand that coding for Node looks and feels weird, but so does coding for Lisp, Smalltalk, Haskell and a long list of other programming languages.


The problem is that the problems Node solves can be solved better and more cleanly than with a spaghetti of callbacks - with coroutines, for example. It's just that Node and javascript are not up to the task.

Now, the thing Node has going for it is that, despite being inferior to other similar technologies, it has a big following (community matters), lots of libs (libs matter), and it's based on a language that is easy and familiar to many.


I kinda like em-synchrony for Ruby. It handles the callbacks with fibers, so my actual code doesn't have the callback hell of JavaScript. Although the implementation of Ruby fibers is not-so-nice at this point; I hope they'll fix it in the next versions.


em-synchrony's problem is that it has people wrap asynchronous libraries one at a time. There are a few problems with that:

You're exposing a synchronous API, but still can't take advantage of the huge ecosystem of Ruby libraries that already expose synchronous APIs.

Wrapping libraries becomes a one-off chore. Each individual library must be wrapped to work in an em-synchrony system, and if the libraries aren't both asynchronous and wrapped in fibers you can't use them. This not only shrinks the ecosystem of libraries further, but is also more error-prone than providing a general coroutine abstraction around socket IO.

Providing a generalized abstraction for doing synchronous I/O with sockets/fibers and an evented backend is exactly what I'm working on in Celluloid::IO:

https://github.com/tarcieri/celluloid-io


This is a very interesting project. I'll have to dig deeper into it soon.


Async code, like threaded code, is different from simple synchronous code. These different coding styles exist to take advantage of the concurrency the system offers. Developers with a simple single-flow-of-control background often complain about the extra flow-control complexity when it's outside of their comfort zone. Think of it as leveling up your skills.

There are libraries out there that add syntactic sugar to make async code look like sync code. Like,

    group (
        asyncfunc1()
        asyncfunc2()
        asyncfunc3()
    )
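
caolan's async library (mentioned elsewhere in the thread) provides exactly this kind of sugar; a hedged sketch of the equivalent using async.parallel, with the asyncfunc names kept as placeholders:

    var async = require('async');

    async.parallel([
        function (cb) { asyncfunc1(cb); },
        function (cb) { asyncfunc2(cb); },
        function (cb) { asyncfunc3(cb); }
    ], function (err, results) {
        // All three have completed; `results` is ordered like the task array.
    });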


I think my biggest gripe is that these things are only possible by using one of the many libraries or rolling your own solution. The rather disorganized state of the node.js libraries is far too confusing for most of us who don't do Node 24x7.


Agreed. The solution needs to be part of the language, or at least node itself, not a 3rd party library (or 10)


It takes some work to get used to the change in flow control, but it seems to be worthwhile (at least it has been so far for my project). Coming from a real-time/embedded background seems to have helped me because typically those systems are heavily event-based. I don't know if node.js will change the world but I think it's worth at least playing with just to get some experience with the programming model.


    getPosts (ids, cb) ->
        res = []
        stash = (err, post) ->
          res.push post
          cb(res) if res.length is ids.length
        db.getPostById(id, stash) for id in ids
        
    getPosts [...], (posts) ->
        # go on...
use `res[i] = post` (capturing the index via `for id, i in ids`) if there is some implicit ordering.


Declaring node a 'giant' step backwards is a stretch. Callback spaghetti isn't the problem it set out to solve. It is meant to provide easy(-er?) concurrency. If you measure it against its goals, I think it's pretty good.


I wrote a module based on fibers to help with these sorts of problems. Take a look at https://github.com/scriby/asyncblock


This is a rather sensational article. The author's first example is pretty easily solved:

  getFromCache = function (id, callback) {
    asynchronousCache.get(['id', id].join(':'), function(err, myThing) {
      if (myThing == null) {
        asynchronousDB.query("SELECT * from something WHERE id = $id", {id:id}, function(err, myThing) {
          callback(myThing);
        });
      }
      else {
        callback(myThing);
      }
    });
  };

  getFromCache(3222, function (myThing) {
    console.log('myThing:', myThing);
  });


Or if you want more re-usability:

  getFromCache = function (id, query, callback) {
    asynchronousCache.get(['id', id].join(':'), function(err, myThing) {
      if (myThing == null) {
        asynchronousDB.query(query, function(err, myThing) {
          callback(myThing);
        });
      }
      else {
        callback(myThing);
      }
    });
  };

  getFromCacheSomething = function (id, callback) {
    var query = buildQuery("SELECT * from something WHERE id = $id", {id:id});
    getFromCache(id, query, callback);
  }

  getFromCacheSomething(3222, function (myThing) {
    console.log('myThing:', myThing);
  });


Don't forget to bubble errors up the callback chain along with return values (I always forget that too):

    getFromCache = function (id, query, callback) {
      asynchronousCache.get(['id', id].join(':'), function(err, myThing) {
        if (myThing == null) {
          asynchronousDB.query(query, function(err, myThing) {
            callback(err, myThing);
          });
        }
        else {
          callback(err, myThing);
        }
      });
    };


Why the downvotes?


I rarely use ifs and whiles in JavaScript. There are better, higher-level libraries that take care of it for you. As a bonus, some of them let you make it *parallelized* with the same syntax. Obviously, when you switch to a new language, you need to learn its new paradigms / designs.


Rarely use ifs?

What do you do if you want to see if, e.g. there are any results for the user's query?

I would do

if (results.length > 0)

or something


> As a bonus, some of them let you make it *parallelized* with the same syntax.

JavaScript is single-threaded, so parallelism within a single JS VM is not possible.


Hence the *



