Hacker News | past | comments | ask | show | jobs | submit | photon_collider's comments

Reading this news only leaves me worried about the long-term future of these open-source tools.

I have long found the VC model for open source questionable. If you aren't selling direct enterprise support that's popular enough, what is the model for actually making money?

Take ruff: I have used it, but I had no idea it even had a company behind it... And I must not be the only one, and it must not be the only tool like it...


Fair point! When I first started on this I went down a deep rabbit hole exploring all the ways I could set this up. Ultimately, I decided to start simple with hardware that I had lying around.

I definitely will want to have a dedicated NAS machine and a separate server for compute in the future. Think I'll look more into this once RAM prices come back to normal.


Ah nice! Didn’t know that. I’ll try that out next time.



Nice to see a new release from Anthropic. Yet this only makes me even more curious about when we'll see a new Claude Opus model.


I doubt we will. The state of the art seems to have moved away from GPT-4-style giant, slow models to smaller, more refined ones - though Groq might be a bit of a return to the "old ways"?

Personally I'm hoping they update Haiku at some point. It's not quite good enough for translation at the moment, while Sonnet is pretty great and has OK latency (https://nuenki.app/blog/llm_translation_comparison)


Funny enough, 3.7 Sonnet seems to think it's Opus right now:

> "thinking": "I am Claude, an AI assistant created by Anthropic. I believe the specific model is Claude 3 Opus, which is Anthropic's most capable model at the time of my training. However, I should simply identify myself as Claude and not mention the specific model version unless explicitly asked for that level of detail."


Ah, I remember Grooveshark. I discovered so much good music there.


"Trust but verify" is still useful especially when you ask LLMs to do stuff you don't know. I've used LLMs to help me get started on tasks where I wasn't even sure of what a solution was. I would then inspect the code and review any relevant documentation to see if the proposed solution would work. This has been time consuming but I've learned a lot regardless.


Almost like keeping a personal write-ahead log. :)


Not quite. A write-ahead log memorializes the final state of the outcome. It should be everything you need to recreate the result or net effect.

If you write it down before you start, it's more like a record of intent.
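To make the distinction concrete, here's a minimal Python sketch (purely illustrative; all names and the log format are hypothetical) contrasting an intent entry written before the work with a WAL-style outcome entry written after it, where only the outcome entries are needed to recreate the net effect on replay:

```python
import json

log = []  # in-memory stand-in for an append-only log file

def log_intent(action):
    # Written *before* doing the work: captures what you meant to do.
    log.append(json.dumps({"type": "intent", "action": action}))

def log_outcome(action, result):
    # Written *after* the work: enough to recreate the net effect on replay.
    log.append(json.dumps({"type": "outcome", "action": action, "result": result}))

log_intent("rename config key")
log_outcome("rename config key", {"old": "db_url", "new": "database_url"})

# Replaying only the outcome entries reconstructs the final state;
# the intent entries tell you what was *attempted*, even if it failed.
replayed = [json.loads(e) for e in log if json.loads(e)["type"] == "outcome"]
```

The intent entries are useful precisely when the work is interrupted: they record what you were trying to do, which the outcome-only view can never show.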


Has Julia's popularity in scientific computing and data science continued to grow? I haven't heard much about it recently.


I use Julia regularly for experimental machine learning. It's great for writing high-performance, distributed code, and it's even easier than Python for this kind of work, since I can optimize the entire stack in a single language. Not sure if it's growing in popularity, but it's really solid for what it does.


Me too, and I'd like it to become mainstream. The major problem right now is that it doesn't have anything that is close to Torch or JAX in performance and robustness. Flux et al. are 90% there, but the last 10% requires a massive investment, and Julia doesn't have any corporate juggernaut funding development like Meta or Google.

This is hurting Julia's adoption. The rest of the language is incredibly elegant, as there is no 2-language divide like in Python. Furthermore, it is really performant. With very little effort one can write code that is within 1.5-2x of C++, often closer.
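A toy Python illustration of the two-language divide being described (function names are mine; the speedup from delegating to numpy's compiled kernels varies by workload): the readable pure-Python loop pays interpreter overhead per iteration, so in practice it gets rewritten against a second, compiled layer, whereas in Julia the plain loop itself is what the compiler optimizes.

```python
import numpy as np

def sum_sq_python(xs):
    # Readable, but interpreted: each iteration pays Python-level overhead.
    total = 0.0
    for x in xs:
        total += x * x
    return total

def sum_sq_vectorized(xs):
    # The "second language": delegate the loop to numpy's compiled C kernels.
    arr = np.asarray(xs, dtype=float)
    return float(np.dot(arr, arr))

data = [0.5] * 1000
assert abs(sum_sq_python(data) - sum_sq_vectorized(data)) < 1e-9
```

Both functions compute the same thing; the point is that getting speed in Python means leaving Python semantics for a vendored compiled layer, which is the divide the comment refers to.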

One possibility is that something like Mojo takes Julia's spot. Mojo has some of the advantages of Julia, plus very tight integration with Python, its syntax and its ecosystem. I would still prefer Julia, but this is something to keep in mind.


LLMs massively compound the advantage of existing popular languages, namely Python. Any new learner will find it infinitely easier to use Sonnet 3.5 to overcome the so-called '2-language barrier' in Python, while the lack of training data for Julia becomes the real barrier.

This issue will remain until LLMs get smart enough to self-iterate and train on a given language. By then, though, we'd likely have languages designed and optimized for LLMs.


To back up the sibling comment, I've found ChatGPT quite capable where Julia is concerned. It does hallucinate the occasional standard library function, but a) it gets it right after it's told it was wrong about half the time and b) Julia's documentation is fairly good, so finding what that function is really called is not a big deal.

It can even debug Pkg/build chain problems, which... Julia could use a bit of polish there. On paper the system is quite good, but in practice things like point upgrades of the Julia binary can involve a certain amount of throwing spaghetti at the wall.


There is a paper claiming that ChatGPT performs best with Julia: https://arxiv.org/abs/2308.04477

Chris's blog post on that: https://www.stochasticlifestyle.com/chatgpt-performs-better-...


For what it's worth I've found Claude Sonnet to work really well with Julia.

One fun exercise was when a friend handed me a stack of well-written, very readable Python code that they were actually using. They were considering rewriting it in C, which would have been worth it if they could get a 10x speedup.

I had Sonnet translate it to Julia, and it literally ran 200x faster, with almost identical syntax.


More fragmented, API-call-heavy Python code actually causes LLMs to hallucinate and mix up libraries more: https://www.stochasticlifestyle.com/chatgpt-performs-better-...


IMO Mojo's memory model is way too unfriendly for python programmers.


Could you elaborate? As far as I understand, if you treat it like Python (e.g. use defs and stick with the copy-on-modification default), you'll still see performance improvements without even thinking about memory.


I want to really like Julia. For me it felt like more work than Python for simple stuff, and not much less work than C++ when trying to get the best performance. It is a cool language though.


It's reasonably popular; growth has continued at a slow but steady pace. It's never going to become Python or anything, but it's great in its niche.

We use Julia in our hedge fund, it allows our researchers to write Python-like syntax while being very easy to optimize – compared to numpy code we've had a relatively easy time getting Julia to run 20x-1000x faster depending on the module, which has resulted in a very large reduction in AWS bills.


Certainly yes in scientific computing, less so in ML/data science. There's a lot of scientific-computing culture in economics -- lots of heavy numerical work in addition to the statistical modeling you might expect.


I think it has correctness issues


Source?



My general sense from this article is that all of these issues have been fixed. The language is relatively new, and the core devs are responsive; using anything new comes with risks. I think the community appreciated a detailed and generally well-reasoned diagnosis, but at the same time these things are relatively easily addressed.


This link lays out the case in an exceptionally thorough and damning (for Julia) way.


Isn't Julia just another brand-name (™, ®, and ©) Python? Like Anaconda? Or possibly the R language?


R is an open source version of S, which was a competitor to SAS.

Julia, from when I looked at it years ago, seemed like an attempt at a new version of Matlab or Mathematica. It was very linear-algebra focused, and was trying to replace those packages plus Fortran. It had some gimmicks, like an IDE that would render mathematical notation like TeX for your matrices.

Python wasn't the obvious "Fortran killer" scientific language it is today. In fact it's arguably really weird that Python ended up winning that segment. In any case, I think Julia's been struggling since its inception.


R and S are also very linear algebra focused. R developers just try to make C++ behave like R as much as possible when they need more speed. Hence, Rcpp. Otherwise, we prefer our LISPy paradise.


I was in Austin while Travis Oliphant's wave from numpy led to Anaconda. After that we got to bring them in as consultants. It was wild talking to the team and hearing the inside-track dev info. It isn't a surprise to me that Python, as flexible and glue-code-friendly as it is, became the Excel of scientific computing.


What kinds of things did you hear from them?


Mostly the vision and ideals which became Anaconda, conda, and miniconda, as well as the translation of ideas to use cases to implementations, and some ideas that came about later in other forms or libraries (numba, pytorch).

Basically a mini/beta/in-progress version of Pycon each week.


Not at all? Totally different programming paradigm and performance. Certain communities pull towards Julia a lot more than others. Mostly I've seen scientific fields that require HPC but don't want to do everything in FORTRAN and C. Paging Chris Rackauckas!


Julia feels like a Matlab++ with its one-based indexing and `function ... end` syntax. Mojo is what you're thinking of.


I primarily use MATLAB, and what stops me from using Julia is the package management.

Also the VSCode extension has weird performance problems when trying to debug Julia code.


> what stops me from using Julia is the package management.

Can you expand on this? Julia's package manager IMO is one of the best parts of the language.


I hate with a burning passion having to manage packages myself. MATLAB comes preinstalled with everything and the kitchen sink.


Fair enough. It probably would make sense to have a Conda-like release of Julia that comes out every year with a broad but curated selection of packages.


Long ago, I trawled through Matlab's docs to come up with a set of Julia packages matching what's built into Matlab: https://discourse.julialang.org/t/julia-for-matlab-users-clu...

I don't think you'd actually want to include each of those packages in a standard distro: does the average user really need to programmatically send emails or deal with Voronoi tessellations? Probably not, but I still think there's value in a batteries-included approach, especially when working with students.


No. It's not.


Astro is really good for this use case. Also easy to host on platforms like Netlify.

