Hacker News | yssrn's comments

A password manager doesn't need frequent updates. Instead, the priorities should be reliability, UX responsiveness, and light resource usage.


...and security, which absolutely requires frequent updates. A password manager should be updated all. the. time. on this factor alone.


That's achieved with frequent updates.


Unfortunately, that ship sailed when they took VC funding.


Scary set of videos. Is Tesla using Marylanders to train FSD?




Virtually all of these changes reduce contrast and differentiation between controls while simultaneously decreasing information density. All bad news for people who work on their computers for a living.


I like borders on my buttons so I know what is a button and what isn't; this feels... backwards.

So much has to be inferred from the way things used to be, which makes it harder to argue that this is the "simpler" operating system designed for humans with a keen eye on UX design.

FWIW I also think the flat UI on iPhone (which has been prevalent for >6y now) is a horror show.

Steve Jobs famously once said (while working at NeXT): "There are two kinds of people at Apple: those that want to push computing forward, and those that want to be the Sony of computers... and the Sony guys are winning."


I remember the transition from iOS 6 to 7. It definitely initially looked modern but I still sometimes have difficulty distinguishing between a label and a button.


Another common issue is the lack of scroll bars or other UI hints like shadows means it's often impossible to know when/if you can scroll.


Though Apple isn’t relenting on this front in the default experience, you can change the appearance in Accessibility options (look for Button Shape). That could help keep these distinct and easier to recognize.


How on earth is moving to the ARM architecture not pushing computing forward?

macOS moving to ARM is the biggest leap for developers in recent years. Amazon is also developing its own ARM chips, and cloud services running Linux on ARM are the future.

Having a Mac with a Unix-like terminal environment and support for running Linux ARM natively when developing for those services is a huge deal.


This is not the ARM thread; this is the UI thread. Nothing about the UI is predicated on ARM.


First of all, this is off topic. Second, it's simply wrong: there is no ARM processor better than the best x86 processor in any category except insanely low power. All the comparisons I've seen so far pit tiny mobile processors against huge server processors with orders of magnitude more I/O per watt. When you compare phone chips with low-power Zen 2 laptop chips and take into account the massive I/O disparity, the picture is much less favourable.


I thought the “white everything” UI approach of Metro apps under Windows 10 was bad enough, but this is even worse with all the blurriness and similar colours. I am going to have to get another job soon, else I’m going to go blind with all these “improvements”. I’m not joking.


I've had "increase contrast" (and implicitly "reduce transparency") enabled since whenever I started using macOS. I really hope those accessibility options work well to reduce whatever changes they're trying to bring about.

On the same note: I don't think the "decreasing information density" would be a problem if you're switching between the terminal/IDE/browser/Excel. The table views still seem fairly compact, so...? /shrug


They can simply get rid of the close/minimize/maximize buttons at this point. I wonder just how many people actually ever use them.


At this point they’re just about the only thing that tells you whether a window is in the foreground or not.


The funny thing is that "USB4" doesn't have a space in the name while "USB 3.2", which they just rebranded everything else as last week, has a space. It has to be some kind of magnificent troll.


It's almost as if a secret RS-232 spy got onto the naming committee and is doing everything possible to derail the standard and bring back the parallel port.


But RS-232 is serial. Maybe the IEEE 1284 guy joined the committee?


It's not just you — it's virtually everyone who will be purchasing a "USB 3.2" cable in the future!


USB-IF has to be doing this to deliberately confuse end users, right? I get that it's a specification number, not a product name, but they have to know that this will only cause problems for customers trying to find the appropriate cable.

Incredibly, their language usage specifications doc https://www.usb.org/sites/default/files/usb_3_2_language_pro... begins with this:

> USB-IF emphasizes the importance and value of consistent messaging on USB product packaging, marketing materials, and advertising. Inconsistent use of terminology creates confusion in the marketplace, can be misleading to consumers and potentially diminishes USB-IF’s trademark rights.

I simply don't understand USB-IF's motivation to make this so confusing for everyone. Their board consists of Apple, HP, Intel, Microsoft, TI, Renesas, and STMicroelectronics, so it isn't like it's controlled by low end trashy cable manufacturers trying to make a quick buck from confused customers.


If I were being cynical, I'd say their motivation is to get people to throw up their hands and say "I don't understand USB any more! Just give me the latest version of Thunderbolt. I know it supports the highest speeds and highest power, including displays and eGPUs, and all USB devices, and works on any new Mac."


It allows manufacturers (yes, many of the same ones on the USB-IF board) to "upgrade" their products by printing another label. USB-IF doing this repeatedly while being fully aware of the consequences can't have any other explanation.

This is just like the 4G-5G-5Ge debacle. Both being motivated solely by financial gain.


> This is just like the 4G-5G-5Ge debacle. Both being motivated solely by financial gain.

You are forgetting 4G LTE... ^__^;


The motivation is the same for re-branding 4G-LTE as 5G (or 6G). Sell the same thing with a new name so customers are tricked into thinking you're selling something newer and better based on the spec sheet without context.


The 5G thing is the carriers' doing, not the standards body's.


It’s like they created Long Term Evolution (no more ‘G’enerations) then immediately realized the marketing issue.


So is there going to be a generation of young people who grow up thinking that all peripherals are USB, and that the naming system is so complicated/nonsensical that it isn’t even worth trying to figure it out?

When I was growing up, we had USB, FireWire, and a couple size/speed variations. It was easy to understand and you could tell what a cable was by looking at it.

For people who grew up in this era, the current situation is super annoying. Perhaps the young people of today will view peripheral standards as some sort of super-obscure language that isn’t intended to be decipherable by laymen?


Yes, and they will be forced to fall back on heuristics like "newer / more expensive / brand specific cables tend to fail less."

Muddying the waters is a diabolically genius way of attacking the generic cable manufacturers.


You kids had it so good. In my day, we had about 15 different peripheral buses that all ran over RS-232C or DB9 cables. You could fry a $10000 printer by plugging in a cable that fit perfectly on both ends.

And we felt damn lucky to have them.


So far all of their moves in regards to USB 3+ and type-c seem to have been to cater to various consumer companies.

If we put it in that context, then this new re-branding has to be done for the same reason - so that the same companies can claim to support a "newer" USB standard (and thus a reason for you to upgrade to the new devices), even though nothing has changed.


They seem to go out of their way to find perverse naming. Apart from 3.0/3.1/3.2 there’s the Hi-Speed/SuperSpeed nonsense. I can never keep any of it straight.


Off topic, but this is how I see your quote btw: https://i.imgur.com/dIFTp4Y.png


When quoting, please prefix lines with > instead of using preformatted text: preformatted text for blocks of prose is horribly unusable, everywhere.


Unfortunately, HN is bad at publicizing this and making it easy. I spent years assuming others knew some magical tool I didn't, before realizing people just do everything manually and stick to the same unwritten style guide.


...it felt pretty obvious to me? ">" is pretty commonly used, in email for instance.


Agreed; I just thought I was missing some hidden feature. Like, I have to do the line breaks myself? I guess?


It’s even better to also italicize (using asterisks), e.g.:

> USB-IF emphasizes the importance and value of consistent messaging on USB product packaging, marketing materials, and advertising. Inconsistent use of terminology creates confusion in the marketplace, can be misleading to consumers and potentially diminishes USB-IF’s trademark rights.

Then at a glance there is a distinction between the comment itself vs. quotations.
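The manual steps described above (prefix each line with "> ", wrap it in asterisks for italics) can be sketched as a tiny helper. This is just an illustration of the thread's convention, not an official HN tool; `hn_quote` is a hypothetical name:

```python
def hn_quote(text: str) -> str:
    """Format prose as an HN block quote: prefix each non-empty line
    with '> ' and wrap it in asterisks (HN's italics markup)."""
    return "\n".join(
        f"> *{line.strip()}*" for line in text.splitlines() if line.strip()
    )

print(hn_quote("Inconsistent use of terminology creates confusion."))
```

You still have to paste the result in yourself, of course; HN itself offers no quoting button.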


Apologies, I read the site every day but post so rarely that I forgot that's the consensus way to block quote. Thanks!


Drives me nuts! For everyone on a phone:

> USB-IF emphasizes the importance and value of consistent messaging on USB product packaging, marketing materials, and advertising. Inconsistent use of terminology creates confusion in the marketplace, can be misleading to consumers and potentially diminishes USB-IF’s trademark rights.


The IRS has a $11,526,389,000 budget. Their Information Services division has a $2,237,659,000 budget, and 6,089 employees.

So the issue isn't a lack of funding or personnel, it's bureaucratic incompetence.
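For scale, the figures quoted above put Information Services at roughly a fifth of the total IRS budget. A quick arithmetic check (numbers taken directly from the comment):

```python
# Figures as quoted above, in dollars.
total_budget = 11_526_389_000
info_services = 2_237_659_000

share = info_services / total_budget
print(f"IS share of IRS budget: {share:.1%}")  # roughly 19.4%
```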


Your comment includes a logical non sequitur. You have to show that those numbers are sufficient; large absolute numbers don't necessarily mean anything in isolation. I'm not sure how to do that, but just pointing to large numbers doesn't make an argument.


I can't speak for the IRS specifically, but I can speak for a few other agencies. The problem is most certainly not incompetence at the individual level-- I've met some of the brightest, most creative people in government IT.

The problem is an incentive structure that rewards constantly adding scope to projects. There are a lot of ways to think about why, but I tend to break it down to two things: Economics and culture.

First, economics. Money is allocated through the federal budget. This budget does not obey the normal laws of physics. There are myriad reasons why (down to the fact that the government can just go ahead and print new money sometimes), but the important thing to understand is that it's harder to remove lines from the budget than it is to add them.

Private industry has a wonderful garbage collection mechanism called "going out of business". In the public sector, projects have a tendency to keep existing. It's not because people in management want to sit around and collect an easy paycheck-- It's because they often have zero control over where the money goes. That ability belongs to congress.

Second, culture. And here, I actually am going to do some finger-pointing. Moving up in the hierarchy almost always means controlling more personnel, and vice-versa, having more people under you means you're higher in the pecking order. If you suddenly find yourself in charge of a 500-person department, well, you might be in line for a promotion to SES. This leads to a strange territorial game where high-level management has no incentive to shrink the size of projects.

The hierarchy tends to self-enforce, notably through the way it communicates[1] (Always Outlook, rarely any type of tool that allows easy communication between management and staff). This is made exponentially worse by the fact that written correspondence can and often does become public record, making it even harder for people to talk openly and honestly with one another.

I think the budget problem is unsolvable, at least by us Silicon Valley types. But one area where we can contribute is project management. Thinking about projects in terms of actual deliverables (SLAs, APIs, usage levels, etc.) and flattening communications channels would really solve a lot of problems.

[1] http://www.melconway.com/Home/Conways_Law.html


Having been part of a company that depends entirely on seasonal business (Christmas), I can understand why the IRS is unable to complete technical projects.

You spend all year working on a system, then the time comes for you to demonstrate its capabilities. It is 2 months to BIG_IMPORTANT_SEASON. A few faults are found in your solution. Not wanting to risk downtime during literally the only time your company matters, your superiors put your system on hold and you get relegated to support work.

The IRS probably works the same way. I am guessing the IT division gets about 6 months to do their work before everyone stops the presses because it's tax season and you don't fuck with stuff any time around tax season.


Fair point. But what about companies that create tax preparation software, such as TurboTax? That could be considered "seasonal" too, yet they've made great improvements over the years. A different scale of course, but same general concept. (Developing seasonally-used software)


Tax preparation isn't seasonal. Some people and companies have to do taxes quarterly, not yearly. Some taxes (payroll, sales tax, use tax, etc... in some states) have to be done monthly.

There's also all the extensions that get filed each year.


The IRS isn't limited by raw money. It does not have the authority to hire whomever it wants, to invest money in whatever it wants, or to set its own salary levels. That all requires approval by Congress. I wouldn't say that a lack of communication between the IRS and Congress is a sign of "bureaucratic incompetence".


And they're responsible for the collection of over $3,180,000,000,000 annually. So I see no validity to your point(s).


Uh, no. The IRS's lack of budget to actually accomplish its goal has been a problem for quite a while now. They can't even afford office supplies, and they surely cannot afford to hire the people they need to accomplish their goal.


Wow. That's larger than the budget of many US states... let that sink in for a minute.


Many states have populations smaller than a single medium-sized city, so that shouldn't be that surprising.


How about this, then... I live in North Carolina. North Carolina is one of the 10 biggest states in the country (actually, #10, I think). 10 million people live here, consuming government services all day - healthcare, roads, parks, welfare checks, etc.

North Carolina's annual budget for providing all of those services is $23 billion - only twice the budget of the IRS.


> North Carolina's annual budget for providing all of those services is $23 billion - only twice the budget of the IRS.

NCDOR—the North Carolina analog of the IRS—has an annual budget of $177 million, compared to NC's total annual budget of $23 billion, or about 1/130 of the budget of the government it gathers revenue for.

The IRS budget is $11.5 billion, compared to the US annual budget of $3.3 trillion. The IRS costs 1/286 of the budget of the government it gathers revenue for.

If you are going to try to say that NC makes the IRS look too expensive for what it does, you are going to need to do more than just point at the size of the state budget to make your argument.
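The ratios above can be sanity-checked with quick arithmetic (figures as quoted in the comment; results are approximate):

```python
# Budget figures as quoted above, in dollars.
nc_dor, nc_total = 177e6, 23e9       # NCDOR vs. NC state budget
irs, us_total = 11.5e9, 3.3e12       # IRS vs. US federal budget

print(f"NCDOR share: 1/{nc_total / nc_dor:.0f}")   # roughly 1/130
print(f"IRS share:   1/{us_total / irs:.0f}")      # roughly 1/287
```

So by this measure the IRS is actually the cheaper revenue agency relative to the budget it funds.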


They also have a similar number of employees: IRS - 76,832; NC State - 83,820


This is the only case in which I'm okay with bureaucratic incompetence.


You still owe them the same amount; their delay just adds uncertainty to the process.


Incompetence here creates an un-level playing field where the unscrupulous and wealthy can game the system to their own benefit.

As non-intuitive as it sounds, you really do want competence here.


Yes! Because paying taxes is so un-American /s


This would be great... if taking (and faking) screenshots wasn't possible.


The character limit is a core aspect of what makes consuming content on Twitter so engaging vs. other platforms.

Will this change improve engagement numbers or ad revenue? I sincerely doubt it.

An actual useful change would be versioning functionality so that users can edit tweets to correct typos or factual errors while not erasing the record.
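A minimal sketch of what such append-only versioning could look like (a hypothetical data model, not Twitter's actual implementation): each edit appends a new version rather than overwriting, so the full record survives.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Tweet:
    """Hypothetical append-only tweet: edits add versions, never erase history."""
    versions: list = field(default_factory=list)

    def edit(self, text: str) -> None:
        # Append a (timestamp, text) pair; earlier versions are kept forever.
        self.versions.append((datetime.now(timezone.utc), text))

    @property
    def current(self) -> str:
        return self.versions[-1][1]

    @property
    def history(self) -> list:
        return [text for _, text in self.versions]


t = Tweet()
t.edit("Teh character limit is great")
t.edit("The character limit is great")  # typo fix, old version retained
print(t.current)
print(t.history)
```

The point of the design is that readers could always expand the history, so an edit can't be used to quietly rewrite what was said.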


Twitter will not implement the ability to edit tweets; tweet text being immutable is one of the core tenets of their data model:

https://www.quora.com/When-will-Twitter-offer-the-option-of-...

