IBM unveils computer fed by 'electronic blood' (bbc.co.uk)
95 points by a_w on Oct 19, 2013 | hide | past | favorite | 35 comments



I can't be the only one who thought of Ghost in the Shell while reading this. If anything, this is a way to explain away the presence of white 'blood' in the cyborgs of the series.

I can understand the power issue, but I wonder why 3D chips aren't pursued more. It can't be just the cooling since liquid cooling (as mentioned) is in use. Maybe inefficiencies in power distribution and distance to adjacent junctions? If so, parallel execution should help with some of the delays.


I love GitS, but when I see "white blood" I think of Bishop and the other androids from the Alien series.


From the end of the article:

"But all of the above will not get electronics down to the energy-efficiency of the brain.

"That will require many more changes, including a move to analogue computation instead of digital.

"It will also involve breakthroughs in new non-Turing models of computation, for example based on an understanding of how the brain processes information."

What exactly does all this mean? How would moving from digital to analogue computation improve energy efficiency? And aren't the whole Turing vs. non-Turing models of the brain still up for serious debate?


Analog computing is dramatically more energy-efficient than digital because it needs less hardware. For example, you can store an analog value in a single capacitor, while storing a digital value requires a multi-bit register made of several transistors per bit. Analog multiplication can be done with an amplifier, while digital multiplication requires thousands of transistors. The tradeoff is that digital computation is exact, while analog computation contains noise that may accumulate through the computation.
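
Here's a toy sketch of that last point (all numbers made up for illustration): the same chain of gain stages computed exactly, and computed "analog-style" with a little Gaussian noise added at every stage.

    import numpy as np

    rng = np.random.default_rng(0)
    stages, gain, noise_std = 100, 1.01, 0.005

    # "Digital" result: exact arithmetic through all stages.
    digital = 1.0 * gain ** stages

    # "Analog" result: each stage multiplies by the gain and picks up noise.
    analog = 1.0
    for _ in range(stages):
        analog = analog * gain + rng.normal(0.0, noise_std)

    print(f"digital: {digital:.4f}")
    print(f"analog : {analog:.4f} (error {abs(analog - digital) / digital:.1%})")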


So is there a particular difficulty in overcoming this noise, or is it more just unexplored territory?


Definitely not unexplored territory; analog computation came first. Notably, the way you program an (electronic) analog computer is by adjusting its circuitry... not exactly the friendliest programming model.

The problem with noise seems to be repeatability. Enough noise, and reliability becomes a huge problem. With the same inputs and the same analog circuit, you want the same answer every time, don't you?

You cannot "overcome this noise" by removing it. The perfection of materials, process and environment required for that would be on the order of an experiment like this: https://www.simonsfoundation.org/quanta/20131010-neutrino-ex...

We overcame the noise via digitization. There is obviously still noise in our current digital computers, since the components within them are fundamentally analog, but digital circuits quantize the analog signals, interpreting them as 1s and 0s despite their analog nature. (This is simplified, and I know next to nothing about digital circuit design.)
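
As a toy sketch of the idea (made-up noise level, not a real logic family's noise margins): each "wire" nominally carries 0 V or 1 V, picks up noise, and gets snapped back to a clean 0 or 1 by thresholding at 0.5 V.

    import numpy as np

    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, size=10_000)

    # Nominal 0 V / 1 V levels plus additive Gaussian noise on every wire.
    analog_levels = bits.astype(float) + rng.normal(0.0, 0.1, size=bits.size)

    # Quantize back to digital by thresholding at the halfway point.
    recovered = (analog_levels > 0.5).astype(int)

    print("bit errors:", int(np.count_nonzero(recovered != bits)), "out of", bits.size)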


I think the main potential in analogue computing is to create complex networks of feedback loops where different regions of stability correspond to different machine states. I've seen models of neural network memory where the interconnection of neurons works like a combination of a symmetric linear transform and an amplifier followed by vector normalisation. The transform maps the sensory input into a reduced dimensional space (where each dimension corresponds to a possible memory). The reduced vector is amplified via the neural response function, and then it's transformed back to the sensory input vector space through an inverse to the original transform. That creates a feedback loop where (because of how the neural response function works) whatever the input is, the system converges to a vector that corresponds to exactly one of the memory vectors. It basically picks out and amplifies the closest memory to the sensory input.

That kind of system is a huge simplification, but similar things could be done with analogue computing. In particular, I think probabilistic computing could be done by setting up network feedback loops corresponding to underlying Bayesian networks, where stable points correspond to highest likelihood parameterisations. (I may actually do some work in this direction next year, because it's pretty cool stuff.)
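
Here's a minimal sketch of the kind of memory network described above, in the Hopfield style (my own toy example, not any particular published model): memories are stored in a symmetric weight matrix, the sign function plays the role of the neural response, and iterating the feedback loop pulls a corrupted input back to the nearest stored memory.

    import numpy as np

    rng = np.random.default_rng(0)
    n, patterns = 64, rng.choice([-1, 1], size=(3, 64))

    # Hebbian outer-product storage: a symmetric transform whose stable
    # points are (close to) the stored memories.
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0)

    # Start from memory 0 with roughly 20% of its bits flipped.
    state = patterns[0].copy()
    state[rng.random(n) < 0.2] *= -1

    # Asynchronous updates: the feedback loop settles into an attractor.
    for _ in range(5):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("recovered memory 0:", bool(np.array_equal(state, patterns[0])))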


I believe "approximate computing" (also called "inexact computing" or "soft computing") is one such area.

Computers always do exactly what you tell them, but that exactness is resource-intensive. With approximate computing you accept that you don't need an exact result, because maybe you're showing it to a human who doesn't care about the difference between 3.40000 and 3.40001. What if you could get that inexact result for half the power? (That's the level of the claims I've seen.)

That is similar to the "analog computation" you might perform in your brain; e.g. recognizing a face is not 100% accurate.
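
As a toy illustration of the precision trade-off (this is just reduced floating-point precision in software, not how real approximate-computing hardware works): the same reduction computed in float64 and in float16, which uses a quarter of the bits per value.

    import numpy as np

    rng = np.random.default_rng(1)
    a, b = rng.random(256), rng.random(256)

    exact = float(np.dot(a, b))                # full precision
    a16, b16 = a.astype(np.float16), b.astype(np.float16)
    cheap = float(np.sum(a16 * b16, dtype=np.float16))  # quarter the bits

    print(f"float64: {exact:.6f}")
    print(f"float16: {cheap:.6f}")
    print(f"relative error: {abs(cheap - exact) / exact:.3%}")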

Here's an article about it: http://news.rice.edu/2012/05/17/computing-experts-unveil-sup...

A bunch of more serious resources: http://shakithweblog.blogspot.dk/2012/11/approximate-computi...


Can you imagine the graphics that would be possible with a one-petaflop computer? Simulation status, right?

I'm praying we (humanity) don't nuke ourselves to death during some stupid WWIII type situation before we get there


Carmack said we need 5 petaflops for truly realistic graphics.

But I'm not even sure he considered 4K+ resolutions, 120+ FPS and 3D when he said that, because all of those may play a role too in the future if we want "Matrix-like" graphics in our virtual reality goggles or holodecks. So we might still need hardware that's orders of magnitude more powerful than that.
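
Back-of-envelope, using the numbers in this thread (5 petaflops, 4K, 120 FPS) plus my own assumption of two views for stereo 3D:

    # Rough FLOP budget per pixel per frame; these figures come from this
    # thread plus my assumptions, not from Carmack or the article.
    pixels_per_frame = 3840 * 2160          # "4K"
    fps, views = 120, 2                     # 120 FPS, stereo 3D
    pixel_rate = pixels_per_frame * fps * views

    budget = 5e15 / pixel_rate              # 5 petaflops spread over the pixels
    print(f"{pixel_rate:.2e} pixels/s -> ~{budget:,.0f} FLOPs per pixel")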


If you can track the eye with low enough latency you only need to render high res in the tiny fovea region, low res everywhere else.
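
Rough numbers on that (the field of view, fovea size, and peripheral resolution here are my own assumptions, not anything from the article):

    # Treat the display as a 110-degree square FOV with a 10-degree
    # full-resolution window at the gaze point and 1/4 linear resolution
    # everywhere else -- assumed figures, just to show the scale of the win.
    fov, fovea, periphery_scale = 110.0, 10.0, 0.25

    uniform  = fov ** 2
    foveated = fovea ** 2 + (fov ** 2 - fovea ** 2) * periphery_scale ** 2

    print(f"foveated rendering draws ~{foveated / uniform:.1%} of the pixels")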


Oddly enough, the military will be the first customer for a one petaflop computer for tactical and strategic scenarios/simulations, which will no doubt include WWIII type situations.


For the near future, it will almost certainly be used to comply with the moratorium on nuclear testing.

Over the last ~20 years many of the biggest supercomputers on the TOP500 list were purchased to run simulations of the aging US nuclear stockpile to answer maintenance questions that formerly required active testing.

https://en.wikipedia.org/wiki/Accelerated_Strategic_Computin...


Sweet, I attended a talk about this last spring at our university. The one big question was about the enormous energy density in these cubes: the energy must get into them one way or another, which seemed nearly impossible to do. But I, for one, hope this leads to some big leaps in performance that were previously out of reach for the foreseeable future.


Is it just me, or is this article written by someone who is clueless? I read the whole thing, and wasn't able to extract a single piece of genuine information.

Either someone is gaming HN with artificial upvotes, or I need to find a different forum.


This article was not written for anyone who has even an average understanding of the subject. It is there to give the simplest, most general overview of an interesting development. It is not supposed to be anything more than that. It is there for shelf stackers, van drivers, policemen, accountants, solicitors, carpenters, mechanics: in fact, anyone but people with any clue to start with. People with a clue are selling themselves short by reading it in the first place. They should already know all about it, and not be blindsided by a BBC piece.

No, the person who wrote it is not clueless; they have written it for the clueless.

And that is absolutely NOT a value judgement on the "clueless".

The vast majority of people very understandably do not have any clue about this niche science. Why on earth should they? After all, does being a whiz with Java or something suddenly mean you should even begin to understand this? Why should it? So why should a bricklayer have any idea at all? All the average "clueless" want is a very, very basic overview, such that they can essentially say, "Oh wow, that's cool. Oi Dave, look at this." And if they are even slightly inspired, they will go off and drill down to the proper scientific detail elsewhere, using something revolutionary like Google.

If you want any more than that, then I'm sorry, it's not the remit of the BBC. And nobody even close to knowing the basics of this sort of stuff would be getting their information from the BBC. Like I have said, they will already be very clued up, right? And anyone in the middle will not even read the article fully; they will get enough information to go to a more thorough source.

I get really frustrated with people who seem clueless themselves about what the BBC is there for, and who articles like this are aimed at. It is always easy for some expert to slag off BBC articles, in the way you have, when they were never written to stand up to peer review or some such high standard. I imagine an "expert" could easily pick holes in literally every single article the BBC has ever published.

What is really depressing is that you have obviously read enough to be interested. But instead of being interested enough to search for more information, you come here to lay into the author. Is it really that much easier to complain than to research? Does the BBC have to spoon-feed everyone, on every subject, at every level? No. It's a broadcaster, not a collection of all the world's best universities. They let you peek in; the rest is up to you.


You are not alone in that opinion. From what I could glean, they are trying to address power distribution and conversion efficiency in large-scale computer systems by integrating a vanadium redox battery[1] directly into the chip (and exploiting the cooling effect of the flowing electrolyte). It is a neat idea, badly described.

[1] http://en.wikipedia.org/wiki/Vanadium_redox_battery


I don't understand this remark. I think they describe the 'why' and the 'what' parts fine for what I think is their target audience. They even mention the 'how' part:

"IBM is looking for a fluid that can multitask.

Vanadium is the best performer in their current laboratory test system - a type of redox flow unit - similar to a simple battery.

First a liquid - the electrolyte - is charged via electrodes, then pumped into the computer, where it discharges energy to the chip.

Redox flow is far from a new technology, and neither is it especially complex."


I couldn't distill a single piece of actual information either. It was a bad read, and I stopped a quarter of the way through.


The BBC's science journalists are ok at journalism. Shame about the other part of their job descriptions.


While this may be a huge improvement in computing power per space occupied compared to today's classical computers, it still feels like a pretty transitional phase to me versus quantum computers, since nature tends to "compute" stuff so much more efficiently than our PCs.

We need to learn how to compute at the sub-atomic level with mind-blowing efficiency (trillions of trillions more operations for insignificant power consumption). Once we learn how to do that, we'll be the masters of our galaxy, or potentially even the universe.


  The art of liquid cooling has been demonstrated by Aquasar
  and put to work inside the German supercomputer SuperMUC
  which - *perversely* - harnesses warm water to cool its
  circuits.
How is it perverse to cool hot circuits with warm water?


The rate of heat transfer is proportional to the temperature difference (delta T), so it's much better for the computer to be cooled with cold water. But how did you produce that cold water? Probably with some energy-intensive refrigeration process. Thus, AFAIK, it's more efficient overall to use a higher inlet temperature (presumably with a higher flow rate).
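
A quick back-of-envelope on the flow-rate part (heat load and temperature rises are my own assumed numbers): the heat a water loop carries away is Q = m_dot * c_p * dT, so accepting a smaller coolant temperature rise just means pumping proportionally more water, which is usually cheaper than running a chiller.

    # Q = m_dot * c_p * dT: flow rate needed to carry away a given heat
    # load for different allowed coolant temperature rises. Assumed numbers.
    c_p = 4186.0        # J/(kg*K), specific heat of water
    Q = 100_000.0       # W, assumed heat load for one rack

    for dT in (5.0, 10.0, 20.0):        # allowed coolant temperature rise, K
        m_dot = Q / (c_p * dT)          # required mass flow, kg/s
        print(f"dT = {dT:4.1f} K -> {m_dot:.2f} kg/s (~{m_dot * 60:.0f} L/min)")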


"Their vision is that by 2060, a one petaflop computer that would fill half a football field today, will fit on your desktop."

Will it be able to run Crysis?

But seriously, I'd be even more interested in how these computers will be programmed. Any wild speculations?


Abstraction will be the answer. We will need very, very high-level, domain-specific "programming languages".

The problem is: I feel like we are abstracting software more slowly than computers are getting faster. Is JavaScript really that much more abstract than machine code (assembly)? I think not. In the long run, what we do now is baby steps.

We are still telling computers exactly what to do. Every step needs to be spelled out. And we're doing it with text files...


Think about the huge amount of computing power currently used to run Gmail on millions of web browsers, servers, and apps. It could be viewed as one big computer program that just happens to run different parts on different devices. The benefit of separating it into parts is that it gives the program close proximity to different data. A massively parallel system would have to divide data into chunks that are separated by latency and bandwidth limitations in a similar way. The language abstractions are less important than the unavoidable physical limitations. Surely the event-driven approach of JavaScript is a good solution to this latency problem?


While domain-specific languages are important, I believe most abstraction today happens at the library/tool level. Yes, some of them use text files and exact instructions, but they are still high level.

And with machine learning, we don't even spell out every step.


And yet, there are so many layers of abstraction under everything that it all still feels no faster than it did in the late '90s...

We don't need to develop extremely high abstractions yet. Not until all the layers below are optimised.


Most people will use some hacked-on C extensions. They'll waste most of the performance, but the inner loops will be fast, and this approach will do better on benchmarks (which involve a small amount of heavily-optimized code).

A few will use Haskell/Erlang/etc. and make better use of the performance.


I loved this comment.

I'm pretty sure it's a (half) joke...


Simple answer: Star Trek-like. NLP + AI + data-flow-based programming.

☛ "Computer, make a sandwich, with extra cheese!"

    • 1: Computer analyses the command and reverse-engineers the recipe at the atomic/molecular level.

    • 2: Femto-lasers a nanostructure onto the replicator using computer models.

    • 3: 3D-prints synthetic biomimetic protein and nano-textured (for colour) carbohydrates.

☛ "Computer, make a new sandwich, this time more piquant!"

    • This time the computer tries to optimise the taste based on personal and global taste data.

    • GOTO: 1

--

Most of this can already be done today.


Brain-machine interface, obvi. That or strong AIs. :)



Wow, that looks insanely close to what I remember from Terminator.

Compare for yourself: http://imm.io/1iQis


Does anyone know much about the practicalities of the chemistry behind this? I've almost finished studying redox in high school.



