It might not be clear to you but as a maintainer and member of the Foundation it's very clear to me. The DNF used temporarily assigned permissions (assigned under the premise that they needed authority to fix a CLA bot) to move several repositories to their own organization. This is unacceptable by any measure.
Technically no, they used their permissions to include the organization in the enterprise. But the more relevant question is why that matters. As I tried to point out above, this has essentially zero impact so far, so people must be worried about some future impact. Which one, specifically? How does this move cause problems that would not otherwise exist?
Thanks! Yeah, we've made excellent progress and whilst we are still calling it an alpha it's a very, very solid alpha.
I know Nathanael and have chatted plenty with him over the years on the subject. We're trying to solve different problems really in the field and I really hope he achieves what he wants.
We've got pretty good performance so far (faster than System.Drawing in many regards, and we know how to improve it further) and a great feature set, but we still have a lot to do.
I'm personally working with another chap on ICC profile support just now which is very complicated but will give us a full color management system once finished.
While I figure that out I'm hoping that some of the performance gurus out there will step up and offer to tweak the library where necessary.
Chat to us on GitHub. Our path-drawing public APIs are very similar to System.Drawing's, so we may well be able to provide you with the same functionality.
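For anyone comparing the two, here is a rough sketch of what path/shape drawing looks like with ImageSharp's drawing extensions. Note that this uses the modern SixLabors package and method names (`Fill`, `DrawLine`, `FillPolygon`), which may differ from the alpha-era API being discussed, and the output path is made up:

```csharp
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.PixelFormats;
using SixLabors.ImageSharp.Processing;
using SixLabors.ImageSharp.Drawing.Processing;

class PathDemo
{
    static void Main()
    {
        // Draw a line and a filled polygon, much as you would with
        // Graphics/GraphicsPath in System.Drawing.
        using var image = new Image<Rgba32>(200, 200);
        image.Mutate(ctx => ctx
            .Fill(Color.White)
            .DrawLine(Color.Black, 2f, new PointF(10, 10), new PointF(190, 190))
            .FillPolygon(Color.CornflowerBlue,
                new PointF(20, 150), new PointF(100, 40), new PointF(180, 150)));
        image.Save("paths.png"); // hypothetical output path
    }
}
```

The fluent `Mutate` lambda plays roughly the role that a `Graphics` object does in System.Drawing, which is what makes porting straightforward.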
Amazed this is trending. If anyone has any interest in image processing or performance in general, please get in touch and give me a hand. Any help will be appreciated.
Great to see the positive comments here. ImageSharp is my baby and it fills me with a lot of confidence that it's something that developers both want and need.
Why would you expect negative comments? .NET Core is a solid platform for web development. It's easy to deploy and relatively fast compared to the usual Python/Ruby/PHP and co... only matched by Java (controlled by Oracle) and Go (which is obnoxious). .NET Core is growing at a fast pace, and it deserves it; C# and F# are phenomenal languages.
Thank you for your work! I used it with a client that had a System.Drawing impl. Just making thumbnails of uploaded images. It actually was our CPU bottleneck! And then it started crashing when load was high, making us drop or reprocess (which added more load and drops...).
With your lib and a few hours of work, all that disappeared and CPU went way down. Thanks!
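For context, the thumbnailing described above looks roughly like this with ImageSharp. The package/namespace names are the modern SixLabors ones and the file paths are invented, so treat this as a sketch of the pattern rather than the exact code from this migration:

```csharp
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Processing;

class ThumbnailDemo
{
    static void Main()
    {
        // Load the uploaded image, shrink it to fit inside 200x200 while
        // preserving the aspect ratio, and save the thumbnail.
        using var image = Image.Load("upload.jpg"); // hypothetical path
        image.Mutate(x => x.Resize(new ResizeOptions
        {
            Mode = ResizeMode.Max,
            Size = new Size(200, 200)
        }));
        image.Save("thumb.jpg");
    }
}
```

`ResizeMode.Max` avoids the manual aspect-ratio arithmetic that System.Drawing thumbnail code typically needs, which is one reason these migrations tend to shrink the code as well as the CPU usage.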
On a side note, users should try to send gratitude to open-source authors in general. I know it's hypocritical of me, since I didn't take the time to email you before this! I've done it a few times in the past, and the replies I've received indicated that I was the only one to ever thank them! That sucks. I've written some stuff I know more than a few companies built entire products and services with -- I've received about 2 notes in 6-7 years. It's not much, but just a bit of thanks really provides motivation.
Cool. I'm assuming you do a lot of matrix operations, so I'm wondering if you've come across the Matrix<> type in Math.NET? It has plug-in 'providers' for concrete implementations, and there are providers backed by Intel MKL, OpenBLAS, ATLAS, CUDA, etc.
If you don't register a provider you just get plain .NET code running serially (though it may use Vector<T> in future versions).
I found that matrix multiplication of dense matrices using MKL was about 60x faster than 'plain' C# code (on a recent Core i7 CPU). The Vector<T> class will get you only some of that gain -- mainly because MKL uses the FMA CPU instructions (fused multiply-add), which are heavily used in matrix multiplication, plus lots of other optimisations that Intel can make and the .NET jitter can't/won't.
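A minimal sketch of what the provider registration looks like with Math.NET Numerics. This assumes the MathNet.Numerics package (plus a native MKL provider package) is referenced, and the matrix sizes are arbitrary:

```csharp
using System;
using MathNet.Numerics;
using MathNet.Numerics.LinearAlgebra;

class MklDemo
{
    static void Main()
    {
        // Try to route BLAS/LAPACK calls through the native MKL provider;
        // fall back to the managed implementation if it can't be loaded.
        try { Control.UseNativeMKL(); }
        catch (NotSupportedException) { Control.UseManaged(); }

        var a = Matrix<double>.Build.Random(512, 512);
        var b = Matrix<double>.Build.Random(512, 512);
        var c = a * b; // dispatched to whichever provider is registered

        Console.WriteLine(c.RowCount);
    }
}
```

The nice part of the provider model is that the calling code is identical either way; only the `Control.UseNativeMKL()` call at startup changes which implementation does the work.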
Great to see a more generic pure-.NET image processing library. When I was doing this 5 years ago we basically had AForge.NET, which was focused more on correctness of algorithms than speed. I tried to leverage that, but ended up with a totally custom image processing pipeline instead -- oh, the battles with the GC and unsafe code...
Out of curiosity, what is your overall vision for the library? Are you aiming for a System.Drawing-style high-level API or more of an OpenCV-like low-level API?
I'd say somewhere in between. I've started with a very high-level fluent API, since I see that as the most common use case, but have exposed a couple of important properties that allow more low-level operations (pointers etc.).
I have also created a generic image pipeline, Image<TColor>, that uses all the packed-pixel color models from XNA and MonoGame, which makes it useful in gaming for loading textures.
The dream is that eventually the community will help port a lot of the algorithms over from libraries like Accord, OpenCV and AForge to allow usage in more scientific scenarios. Some of them will fit in the fluent API, most will not, but the tools should already be within the library to allow porting and usefulness.
I do agree. When I spoke to one of the guys there a while back though they said they simply had too much on their plate at the time and were hoping the community would kick in and help.
Yes, I'm hugely impressed by how much the teams at Microsoft have accomplished in the time, and how they have managed to change the direction of the platform without making a big mess. .NET Core could never be feature-complete on version 1, and I think that the folks involved have been very candid about the fact that it may take a while longer before it is able to support everything that many real-world projects need.
Look. People are voting me down, probably because they disagree with me. Let me explain where I'm coming from.
There are two types of programming ecosystem:
1) The wild-west world, where there's no good tooling (IDEs, debuggers, etc.), all libraries are provided by hundreds of random individual developers and are of varying quality, vital tools (such as build systems) are a constant juggle of whatever's trendy at the moment, etc.
2) The world where there's a single comprehensive framework of extremely high compatibility managed by a single party, where tooling is excellent, where backwards compatibility is of paramount importance, and where everything you write is on a solid foundation that's not going to move out from underneath you.
If you want environment 1), you have a ton of choices. You can use Node.JS, you can use Python, you can use Ruby, you can use GoLang, Java, etc. If you want environment 2), you have exactly one choice: .Net. And in a year, based on everything I'm reading about .Net Core, you'll have zero.
The only saving grace is, being Microsoft, I can be confident that the .Net 4.5 stuff will work for at least another decade. Even if it's not the "trendy new hotness".
Maybe I'm a freak outlier, but I much, much prefer the old Microsoft that wouldn't even think about releasing a product until it was 100% complete, tested, and stable. I think they're moving in exactly the wrong direction, and driving full speed away from everything that made .Net such a great platform in the first place.
-----
Every time I've had to use a language where a lot of functionality is provided by "the community", it's been a constant cascade of buggy and badly-designed libraries. "The community" doesn't test their stuff before releasing it. Or the library works for the one tiny purpose it was written for, but isn't generic enough to be useful to anybody else. It's nobody's job to ensure quality or completeness, so it simply does not happen.
I'd rather have good code, even if I have to wait longer, than code from "the community".