
They can try it on desktop and web for free if they are that skeptical.

Can you imagine? A lot of young people don't have a desktop or laptop, only mobile.

Aka "dusting"

So much of the modern web is designed in service of ad revenue. If you optimize a website for usefulness, you lose ad revenue. Every time we force you to click to see another month/week/paragraph, we make money.

Like a lottery only advertising the winners.

"Without discernment" is implied in "regardless of whatever obstacles are in the way". Many of us would consider excessively exploiting others to be an obstacle.

For doctors in the US I think the limit is more artificial than that: a cap on how many med school seats are allowed.

The captive audience after some (mostly) arbitrary grinding (not being drawn to serve the niche by any particular talent or interest besides $) is the comparison being drawn.


The home page states Lix can diff "any file format like .xlsx, .pdf, .docx".

Wow, sounds useful. Git doesn't do that out of the box.

BUT... the list of available "plugins" only has .csv, .md, and .json, which are things that git already handles just fine?

Can it actually diff Excel, Word, and PDF files or not?


It can, but the plugins are not developed to production readiness yet. I should clarify that.

The way to write a plugin:

Take an off-the-shelf parser for .pdf, .docx, etc., and wrap it in a lix plugin. The moment a plugin parses a binary file into structured data, lix can handle the version control stuff.
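
For concreteness, a rough sketch of the idea in TypeScript. Everything here is hypothetical: LixPlugin, detectChanges, and parseXlsx are made-up names to show the shape, not lix's actual API.

    // Hypothetical sketch only, not the real lix plugin API.
    interface Cell { row: number; col: number; value: string }

    interface LixPlugin {
      key: string;
      detectChanges: (args: { before: Uint8Array; after: Uint8Array }) => Cell[];
    }

    // Imaginary off-the-shelf parser: .xlsx bytes -> flat list of cells.
    declare function parseXlsx(bytes: Uint8Array): Cell[];

    export const xlsxPlugin: LixPlugin = {
      key: "xlsx",
      detectChanges: ({ before, after }) => {
        const prev = parseXlsx(before);
        const next = parseXlsx(after);
        // Report cells whose value changed; the version-control
        // bookkeeping over that structured data is lix's job.
        return next.filter((cell) => {
          const old = prev.find((c) => c.row === cell.row && c.col === cell.col);
          return !old || old.value !== cell.value;
        });
      },
    };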


IMHO the main point of the article is that a typical unix command pipeline IS parallelized already.

The bottleneck in the example was maxing out disk IO, which I don't think duckdb can help with.


Pipes are parallelized when you have unidirectional data flow between stages. They really kind of suck for fan-out and joining though. I do love a good long pipeline of do-one-thing-well utilities, but that design still has major limits. To me, the main advantage of pipelines is not so much the parallelism, but being streams that process "lazily".
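
To make the contrast concrete (the log file name is just an example): a straight pipeline parallelizes for free, while fan-out needs bash-specific tricks like tee with process substitution.

    # Straight pipeline: each stage is its own process, all running
    # concurrently, streaming lazily between them.
    grep ' 500 ' access.log | cut -d' ' -f1 | sort | uniq -c

    # Fan-out is clumsier: tee duplicates the stream into a process
    # substitution while the main pipe keeps flowing (bash, not POSIX sh).
    tee >(grep ' 500 ' > errors.txt) < access.log | wc -l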

On the other hand, unix sockets combined with socat can perform some real wizardry, but I never quite got the hang of that style.
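
A small taste of that style (socket path and filter are arbitrary examples): one socat exposes a filter as a service on a unix socket, and clients pipe through it.

    # Serve `tr` on a unix socket; `fork` spawns one handler per client.
    socat UNIX-LISTEN:/tmp/upper.sock,fork SYSTEM:'tr a-z A-Z' &

    # Any process can now use the filter over the socket.
    echo hello | socat - UNIX-CONNECT:/tmp/upper.sock   # prints HELLO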


Pipelines are indeed one flow, and that works most of the time, but shell scripts make parallel tasks easy too. The shell provides tools to spawn subshells in the background and wait for their completion. Then there are utilities like xargs -P and make -j.
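
For example (compressing logs is just a stand-in task):

    # Background jobs plus wait:
    for f in *.log; do
      gzip "$f" &          # each gzip runs concurrently
    done
    wait                   # block until every background job finishes

    # Or let xargs manage a fixed-size worker pool:
    printf '%s\0' *.log | xargs -0 -n1 -P4 gzip   # at most 4 jobs at once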


UNIX provides the Makefile as the go-to tool if a simple pipeline is not enough. GNU make makes this even more powerful by being able to generate rules on the fly.

If the tool of interest works with files (like the UNIX tools do), it fits very well.

If the tool doesn't work with single files, I have had some success using Makefiles for generic processing tasks: the target is a marker file that records that a given task completed.
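
A minimal sketch of that marker-file trick (tool and paths are hypothetical; recipe lines must start with a tab):

    all: out/.imported

    # import-tool writes many output files, so completion is tracked
    # with a marker file instead of a single real output.
    out/.imported: data/input.csv
    	mkdir -p out
    	import-tool data/input.csv --dest out
    	touch $@    # the marker records that this step finished

Running make twice shows the point: the second run sees the marker is newer than the input and does nothing.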


Having played enough video games that use joysticks for steering, I don't want to drive a real car with a joystick. Crashing in Mario Kart or Grand Theft Auto because I sneezed is fine, but not in real life.


Exactly. The control needs to require both intentional and major motor movement from the driver. Modern steering wheels have the same benefit as the original iPod wheel: easy for small movements, even accidental ones; possible for big movements.

Also funny that they had the ability to swap to the passenger to drive it. So acceleration/brake for one person, steering for another? Really not a good idea.


Surely if NYC can do 99¢ slices, it's not as simple as high rent costs?


The day of the dollar slice in NYC is dead; most places have switched to $1.50 slices.

