Hacker News | new | past | comments | ask | show | jobs | submit | ch_123's comments | login

I find this story odd because IBM was consistent with their keyboard nomenclature across multiple products, and the 3270 series mainframe terminals used the Tab key, located in the same place where you would find a tab key on a modern keyboard, to move the cursor to the next field.

https://www.bitsavers.org/pdf/ibm/3278/GA27-2890-4_3278_Disp... (Page 73 of the PDF)

As an aside, it's worth noting that moving between fields was important enough on IBM terminals that they had a dedicated "back tab" key located on the opposite end of the keyboard to the tab key. On the original IBM PC, they decided to combine both functions into a single key. As a result, the tab key on the classic PC keyboard features the symbols for both forwards tab and back tab on the same key, the back tab symbol being on top to indicate that you need to hold down shift to use that function.

EDIT: The 5250 series terminals used the terms "Field Advance" and "Field Backspace" instead of Tab and Back Tab, but otherwise they used the same symbol on the keys, and the keys were located in roughly the same position as the 3270 series. Reference: https://www.bitsavers.org/pdf/ibm/5291/GA21-9409-0_5291_Disp...


Here's a real IBM 3270 keyboard.[1] Note the "Next field" key on the left, and the matching "Previous field" key on the right.

The IBM 3270 was a device for filling up forms. The mainframe sent the terminal a form with blanks, and the terminal let the user fill in the blanks. The terminal hardware prevented the user from overwriting the static parts of the form, and could apply some other form constraints, such as numeric fields. That was all done by the terminal. When the form was filled in, the user pressed ENTER, and the completed form was sent to the mainframe as one transaction. This approach let one mainframe service huge numbers of terminals. The user never experienced delays while typing and could type at full speed, often without looking.
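
The form-filling transaction model described above can be sketched in a few lines of Python (a hypothetical illustration, not the actual 3270 data stream; the field names and constraints are invented): the terminal-side logic validates input and handles Tab navigation locally, and only Enter ships the whole form at once.

```python
# Sketch of the block-mode form model: protected labels, unprotected
# input fields with terminal-side constraints, one transaction on Enter.

class Form:
    def __init__(self, fields):
        # fields: list of (label, constraint) pairs; constraint checks input
        self.fields = [{"label": lbl, "check": chk, "value": ""}
                       for lbl, chk in fields]
        self.cursor = 0  # index of the current input field

    def type_text(self, text):
        f = self.fields[self.cursor]
        if f["check"](text):          # terminal-side constraint (e.g. numeric)
            f["value"] = text

    def tab(self):                    # Tab: advance to the next field, wrapping
        self.cursor = (self.cursor + 1) % len(self.fields)

    def enter(self):                  # Enter: send the whole form in one shot
        return {f["label"]: f["value"] for f in self.fields}

form = Form([("name", str.isprintable), ("qty", str.isdigit)])
form.type_text("widgets")
form.tab()
form.type_text("oops")   # rejected locally: "qty" is a numeric-only field
form.type_text("42")
print(form.enter())      # {'name': 'widgets', 'qty': '42'}
```

The point of the sketch is that nothing between the keystrokes and the final `enter()` involves the host at all, which is what let one mainframe serve so many terminals.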

PCs didn't have that usage model. The PC crowd was thinking "typewriter". One of the first terminals for home computers was called the "TV Typewriter".

Web forms do have that model, but with less consistency.

[1] https://sharktastica.co.uk/resources/images/model_bs/themk_1...


My favorite feature of 3270/5250-style keyboard layouts is the separate carriage return and Enter keys, allowing for multiline text entry without special handling to avoid conflicts with the command to signify that input is complete.

With only a single combined Enter/Return key, it's hard to remember in any given context whether Shift+Enter or Control+Enter will open up a new line instead of immediately sending a message, dismissing a dialog box, completing input into a particular spreadsheet cell, editable filename, text object in a drawing program, etc., or whether I need to copy/paste a line break from another application because no such shortcut exists at all.


The software I've used has been pretty consistent about Shift+Enter meaning "new line without triggering the Enter key action" and Ctrl+Enter meaning "action" (send the message, go to the next field, etc). The behavior of the Enter key varies from program to program, but I have yet to find a program where Shift+Enter triggered an action while Ctrl+Enter did not. Which software are you using where Shift+Enter triggers an action?

(I do agree that it's hard to remember what Enter is going to do, of course. It's just Shift+Enter where my experience differs from yours).


Yeah, the separate Enter and Return keys on Macs have always been nice.

The SAP application model is such a form-based model (no surprise, given that all five co-founders of SAP were ex-IBM consultants who were fired for moonlighting - specifically, for writing payroll software for chemical giant ICI in assembler on ICI's mainframe during extended night sessions...).

SAP calls its forms "dynpros" (dynamic programs), and the resulting interactive mode of processing "realtime/dialog programming" as opposed to "batch processing". This all looked very IBM 3270-"inspired" (as was the SAP logo, made up of IBM blue with the well-known stripes...).


Nitpick: The terminology used by IBM on the 3270 family (including the 3277 whose keyboard you shared) was "Tab" and "Back tab", not "Next field" and "Previous field".

Yes: Page 30 of The Operator's Guide for IBM 3270 Information Display Systems calls it a Tab key.

   Tab also has typamatic capability that allows you to move the cursor quickly from field to field.

  The Back Tab key moves the cursor back to the first character position in an input field. If the cursor is already in the first character position of an input field, and if you press the Back Tab key, the cursor will then move back to the first character position of the preceding input field.

So my guess is that the cursor defaults to being at the start of a field as you navigate, so Tab and Back Tab work as expected. But if you're editing a field and have moved the cursor within the field, then Back Tab acts differently.

https://usermanual.wiki/Document/GA2727421OperatorsGuideforI...
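
The Back Tab rule quoted above can be sketched as a small function (a hypothetical illustration; real 3270 addressing works on buffer positions and field attribute bytes, which are simplified here to a list of field start positions):

```python
def back_tab(field_starts, cursor):
    """Sketch of the Back Tab rule: if the cursor is past the start of its
    field, jump to that field's start; otherwise jump to the previous field.

    field_starts: sorted cursor positions where input fields begin.
    Returns the new cursor position.
    """
    # Find the start of the field the cursor is in (last start <= cursor)
    starts_before = [s for s in field_starts if s <= cursor]
    current_start = starts_before[-1] if starts_before else field_starts[-1]
    if cursor > current_start:
        return current_start          # mid-field: back to this field's start
    # Already at a field start: go to the previous field (wrapping)
    i = field_starts.index(current_start)
    return field_starts[i - 1]        # Python's index -1 wraps to the last field

starts = [10, 25, 40]
print(back_tab(starts, 30))  # 25: mid-field, so back to the current field's start
print(back_tab(starts, 25))  # 10: at a field start, so back to the previous field
```

This matches the guess in the comment above: the two-step behavior only shows up once you've moved the cursor within a field.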


This is correct. It's also worth pointing out that the 3270 defaults to overwrite rather than insert mode, and has an "Erase EOF" key that deletes all text from the cursor position to end-of-field.

Note as well that a screenful of user-entered, checked/constrained text, intended for some database query or insert, meant just one interrupt to the mainframe CPU; and all the info arrived in an easy-to-parse format. Very low use of resources.

It looks really strange to have 3 keys right next to each other all pointed in the same direction.

As another head scratcher, what is the shift-1 symbol? The exclamation point appears to be the shift next to one of the 3 left arrow keys, but I'm also unfamiliar with the regular unshifted key. Anyone familiar with these?


It's the | symbol.

On later generations of IBM terminal keyboard, you'll see | on the shift-1 position, and a separate key with the broken-bar (¦) symbol. For example, on this keyboard, the broken bar is below the backspace key along with the \ character. https://sharktastica.co.uk/image?id=qhTU8QvD

The reason for the two different types of bar/pipe characters, and why the original IBM PC keyboards only had the broken bar on the keyboard, involves a particularly arcane footnote of history relating to supporting the PL/I language on ASCII terminals: https://www.os2museum.com/wp/a-wunderbar-story/


I had never come across the broken bar before. I feel sorry for those who had to suffer these keys. The keys themselves look like they have about a mile of travel, which was probably exhausting as well. I learned to type on clackity-clackity arm typewriters, and those keys were, I assume, designed by a masochist. These look even worse.

The early 3270 keyboards (and the other IBM keyboards from the mid 70s up to the early 80s) are some of the most pleasant keyboards I've typed on in terms of key weighting and tactile feel. The length of travel is comparable to modern mechanical keyboards. The downside is how tall and aggressively angled the keyboards are, which are very far from modern ergonomic standards.

Those beam springs had the second-best feel ever, after the Selectric. Then the Model F cost-reduced (and thickness-reduced) it, and the Model M further.

> The downside is how tall and aggressively angled the keyboards are, which are very far from modern ergonomic standards.

Like a typewriter. And like a typewriter, you were expected to have them on suitable furniture so the keytops were at the right height.


Arrow keys are giving swastika vibes, which I guess fits given IBM’s history

Having worked at IBM, I would guess that using the tab key in this way was part of a patent they were pursuing and Microsoft's use would show this to be 'obvious' and thus not patentable. But that is just a guess.

In the 80's IBM had a whole class of high level technical people called "Systems Engineers" whose entire job description was to opine on the merits of any given system. Not write systems, not debug them, and certainly not to explain them, it was simply to opine "you're doing it wrong."


Microsoft is suffering from the lack of such a group today; they're definitely doing it wrong, where "it" is pretty much everything... except pissing off users.

Microsoft could implement the "Am I doing it wrong?" check via the shell script `/bin/true`

They’d need to install WSL2 first though!

It's a Linux subsystem for Windows so we'll call it Windows Subsystem For Linux.

For trademark safety, this is the correct approach. You can say "Blah for XXX" and that's fine but if you say "XXX blah" then you can get into trouble.

Is this really true, or is it just something people have repeated enough times, like 'nuclear Gandhi'?

I mainly ask because Microsoft has another product called Linux Integration Services: https://www.microsoft.com/en-us/download/details.aspx?id=551...


It's not a rule, it's just convention. Trademark law is about whether there is confusion about who made the product but not specific wording. Using "Blah For XXX" wording just makes it clearer.

"Tool for Windows" vs "Windows Tool"

The latter sounds much more like it could come from Microsoft. People repeat this because it avoids this confusion, but it is not mandatory. A few projects on GitHub have had to be renamed because they were challenged, and the accepted solution from the trademark holder has been to switch it around and become "for XXX".


In the present context, I'm reminded of IBM's "OS/2 for Windows". It was actually a reduced-price version of OS/2 2.1 that used a customer's existing copy of Windows 3.1, to avoid the cost of licensing the Windows 3.1 components IBM otherwise shipped with OS/2 for Windows compatibility. But it was also a marketing ploy to reposition OS/2 as a Windows enhancement rather than a replacement OS (which, to be fair, is not as misleading as it may sound, since OS/2 2.1, unlike Windows 3.1, was capable of memory protection and preemptive multitasking between Windows applications).

That makes sense, although in that case personally I would have named it Windows Linux Subsystem.

As a lawyer trained in trademark law, I've never heard this. Do you have any references?

What if you just apply for a free sublicense and you get approved and your massive cadre of attorneys aren't fighting each other over 5 letters?

Apparently some HN people think that MS is so sleazy that they will just go "GPL yoink" and start running/advertising/supporting Linux without notice to, or consent from, the benevolent dictator. That's projection.


To solve this matter, I propose renaming it to WNL:

  WSL is not Linux

For a recursive acronym, I prefer LiNT, officially LiNT is NT, and unofficially, either Linux in NT or Linux is Not There, with the official and second unofficial definitions reflecting the WSL 1 architecture where WSL, like Win32, is a subsystem layered on top of the NT kernel, and doesn't rely on any Linux kernel code.

It's not a part of Linux, so it can't be a Linux subsystem. It is a part of Windows, so it is indeed a Windows subsystem. It also started as (and later replaced) a part of the NT kernel called a subsystem, so it is called a Windows Subsystem.

I don't know where the strong objection to this particular name comes from. MS does do weird things with names, like Live, .NET, or Copilot, but this isn't one of them; in fact, it's named quite sensibly. Would you also object to the 'Linux kernel module for Android'?


Use of the English language can often lead to ambiguity.

In the case of "Windows subsystem for Linux", it can be reasonably read that "for Linux" means that the functionality applies to Linux, i.e., to provide it with a subsystem that gives it Windows functionality.

Similarly, in the case of "Linux subsystem for Windows", it can be reasonably read that "for Windows" means that the functionality applies to Windows, i.e., to provide it with a subsystem that gives it Linux functionality.


it’s a windows product therefore windows comes first in the name. at least that’s my recollection of the reason why i’ve seen before

Wasn't that due to trademarks?

A TRUE: device?

As long as I can plug in a serial console and it endlessly spits out 1s!

> I would guess that using the tab key in this way was part of a patent they were pursuing and Microsoft's use would show this to be 'obvious' and thus not patentable.

Something that's bothered me about user-facing patents:

Let's assume that the idea of using a keyboard key to move between input fields in a software form is not obvious, and in fact is a brilliant stroke of genius the likes of which the world is not likely to see again. If that one guy hadn't been born, we would have gone thousands of years with no method, keyboard-based, mouse-based, or otherwise, of moving from one input field to another input field. Every piece of software would use nonconfigurable timers, and you'd just have to hope you could type fast enough.

I don't see what the hypothetical benefit of extending patent protection to this brilliant idea is supposed to be.

Say you're the company who comes up with the idea. You can benefit by including it in your product, where all your users can see it. In other words, the benefit you get from coming up with this idea is that you can publish it for the world to see, and that's the only way you can benefit from it. A usability feature that your users cannot use or know about doesn't increase usability.

Even though the idea isn't obvious, the implementation is. If you disclose your brilliant idea, everyone will copy it and your advantage in the marketplace will be transitory.

So... what is the purpose of giving you a patent? That cripples the marketplace, but it fails to realize the benefit of patents, publication. Publication necessarily had to happen anyway.


Err... wasn't your post a perfect example of why patents exist?

The concept probably has a real name; I call it first-mover disadvantage. It is much easier to copy a mechanism than to invent it. So why even try? Everything you spend real effort to invent is trivially copied the instant you try to sell it. And those copying it don't have to bear nearly the R&D expenses you did, so it is trivial for them to sell the mechanism for less, meaning you don't even get a fair slice of the pie.

So to try and limit this imbalance, we invent a legal fiction: ownership, not of a physical thing, but of the way it works. Not forever, but for 20 years you get ownership over those works.

Patents do have their problems, but I think the core idea is sound: create a registry of mechanisms and use it to provide economic protection to the inventor.


> Err... wasn't your post a perfect example of why patents exist?

Why? In this scenario, what would happen with a patent that wouldn't happen without a patent?


I worked on a software project many years ago. We spent a lot of money over months doing user studies to figure out the best UI for a narrow demographic.

The final UI was simple and intuitive, but it took a lot of money figuring it out.

I don't think the money would have been spent if our competition could immediately copy what we figured out.

Customers did benefit then, and now, 20 years later, anyone can do it, and humanity is a little better off than if no research was done.


So... in this scenario, what would happen with a patent that wouldn't happen without a patent?

I don't know whether I'm missing something obvious, but with a patent, only the patenting company would use their patented idea. In your post you say:

> If you disclose your brilliant idea, everyone will copy it and your advantage in the marketplace will be transitory.

but that is the very point that patents are supposed to prevent. So why do you say that?

The post you're replying to says:

> I don't think the money would have been spent if our competition could immediately copy what we figured out. Customers did benefit then, and now, 20 years later, anyone can do it

so clearly the patent worked for them: they were able to use their simple and intuitive UI, while the competition could not copy it till 20 years later. So what is the question?


> “a brilliant stroke of genius the likes of which the world is not likely to see again. If that one guy hadn't been born, we would have gone thousands of years with no method”

But that’s not the criteria for granting a patent. It doesn’t have to be a stroke of genius. It can be something that many people could invent at the particular moment of the filing (as evidenced by many cases of near-simultaneous patent filings, like Daimler and Benz competing for the ICE in the 1880s). It just needs to be demonstrably novel.

I’m not saying tabbing back and forth through dialog fields qualifies, but then again it’s hard to place oneself in 1980.


The arrow keys, and enter, are the obvious ones to use, but you have to move off of home row to hit them. That's the "non-obvious" bit of using the tab key to navigate fields. Back when that level of usability was important.

> A usability feature that your users cannot use or know about doesn't increase usability.

Cannot is maybe doing a lot there. There's plenty of usability features that aren't really obvious or apparent unless you look very closely. Ex: pinball machines have timed shots, but there's almost always a grace period so if you contact the ball with your flipper around when the timer hits zero and it makes the shot, chances are you'll get credit for it even though the timer expired. That's a usability feature most users won't ever notice. At WhatsApp, I would never send an S40 user a verification code where the 4th digit was 8, because if you got a text message with 123-890, s40 would turn -8 into an 8th note emoji; until today, probably 3 people knew that ... but it dramatically improved usability.

> Even though the idea isn't obvious, the implementation is. If you disclose your brilliant idea, everyone will copy it and your advantage in the marketplace will be transitory.

> So... what is the purpose of giving you a patent? That cripples the marketplace, but it fails to realize the benefit of patents, publication. Publication necessarily had to happen anyway.

If I had gotten a patent on the 'avoid -8 in verification codes', then the technique would have been public for everyone to see. So publication for exclusivity / forced licensing is an exchange of value between society and the inventor. Of course, avoid -8 is pretty obvious, when someone testing the s40 client complains about getting an 8th note in their verification code message, you make a quick tweak to code selection to avoid sending those.

For an invention that must be disclosed to be used, society isn't really getting anything in return for exclusivity. Maybe promotion of progress, theoretically, I guess, in that whoever thinks of it first gets paid; leading more people to think about things?
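
The "avoid -8" tweak described above could be sketched like this (a hypothetical reconstruction of the rule, not actual WhatsApp code): draw a six-digit code and reject any whose fourth digit is 8, so the rendered "123-890" form never contains the "-8" sequence that old S40 clients displayed as an eighth-note emoji.

```python
import random

def verification_code(rng=random):
    """Draw a 6-digit code whose 4th digit is never 8.

    Codes are shown to users as "123-456", so a 4th digit of 8 would put
    "-8" in the message text; rejection sampling avoids that entirely.
    """
    while True:
        code = "".join(str(rng.randint(0, 9)) for _ in range(6))
        if code[3] != "8":
            return code

codes = [verification_code() for _ in range(1000)]
assert all(c[3] != "8" for c in codes)
```

Rejection sampling keeps the remaining codes uniform over the allowed set, at the cost of a roughly 10% chance of redrawing each attempt.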


You're missing the historical context. Prior to patents, inventions would be commercialized as magic tricks and the mechanisms hidden. Then the inventor died and the secrets were lost.

For example, Cornelis Drebbel air conditioned Westminster Abbey in 1620. King James I (of the Bible) thought it was a cool party trick. But there was little ecosystem to commercialize and Drebbel moved on with his life, trying to sell other products with temperature controlled feedback loops + a submarine. Then he died.

The only commercialized invention of his was creating a dye that was redder than others. His son-in-law kept that a family secret and focused on selling this improved dye throughout Europe, since that didn't require revealing the secret.

The rational move was to give up on a multitrillion dollar HVAC industry to sell redder dye, since the second could be a trade secret.


> + a submarine

We still don't know how he solved the problem of carbon dioxide build up. We know he solved it, though!


Comments like this are the absolute best part of HN. Thank you for sharing this.

Likewise.

And after reading the Wikipedia article on Drebbel, how have I never heard of this guy?

I'm particularly curious how the Royal Navy failed to realize the value of the submarine.

Reading over the article on the history of the torpedo, it sounds like early attempts to weaponize, by Drebbel and others, were unsuccessful.

Even so — bearing in mind that this is undoubtedly a reflection of my own bias as a child of the Cold War, raised in the shadow of the largest military-industrial complex the world has ever known — I can't help but marvel at the fact that no spare-no-expenses crash development programs arose among the naval powers of the time to operationalize effective submarine-based warfare.


> the benefit you get from coming up with this idea is that you can publish it for the world to see, and that's the only way you can benefit from it

That’s your opinion, but it’s not the spirit of the law. I’m personally fully against Intellectual Property, including for movies and music, for reasons that are obvious (public money is being spent aimlessly trying to prevent two private individuals from copying things that are copied by their very nature of being published - or trying to prevent people from using ideas that are contagious - what next, put a copyright on political ideas? on dance moves? on beautiful colors?) but that’s not the law.

> we would have gone thousands of years with no method

There are other methods: the 4 arrow keys. The tab method is much more efficient and easy to implement, but absent it, we would simply have gone with the 4-arrows-to-navigate-fields method.


> move between input fields in a software form

IBMs earliest block mode terminals with field entry, including the 3270, predate the microprocessor. They were fully implemented with fixed hardware control.


IBM also infamously patented the XOR cursor.

Presumably it's to give you an advantage for putting in the work to develop it for a period of time.

You can say the same about swipe to unlock and that had been litigated to death.

I did say the same about swipe-to-unlock:

>> Something that's bothered me about user-facing patents


> I would guess that using the tab key in this way was part of a patent they were pursuing and Microsoft's use would show this to be 'obvious' and thus not patentable.

IBM insisting that it not be Tab wouldn't make sense. Microsoft was working for them, and the programs were supposed to adhere to the CUA (Common User Access) standard.


OS/2 1.0 and the first edition of the CUA were both released in December 1987 according to Wikipedia; Raymond's story isn't dated but could've happened before this. (If I had to make a wild guess, I could imagine this request was a side effect of some internal IBM battle about what the CUA should dictate).

I imagine this is mostly about form-based applications, GUI or not, before Microsoft pulled the rug from under IBM.

What?!?! I was an IBM Systems Engineer in the late 1980s / early 1990s and that was nothing like my job description.

Do you remember what the official definition was? I admit I was working as an intern in FEIS (Field Engineering Information Services) in Colorado, and people with that title would occasionally yo-yo into a meeting, make some comment that didn't apply, and then yo-yo out again. None of the engineers in the organization had anything but disdain for them. If you were there in the late 80s and I was interning in the late 70s, it's entirely possible that they restructured the job responsibilities somewhat. But again, I'd really love to see the official job description from the time.

> IBM was consistent with their keyboard nomenclature across multiple products, and the 3270 series mainframe terminals used the Tab key

While it seems odd in light of IBM's usual adherence to corporate norms across business units, having read a couple different books on the origins of the PC at IBM, it may be related to the entire PC unit in Boca being an extraordinary aberration outside the norm for IBM. Despite seeming hopelessly corporate to the Microsoft team, the Boca IBMers were considered a "Rebel Unit" inside IBM - when they were considered at all, since the vast majority of IBM wasn't even aware of it.

Due to being started virtually overnight (in IBM timescales), running incredibly fast, and only existing thanks to CEO Frank Cary himself overruling his lieutenants to approve such a "skunk works", Boca didn't have nearly the level of oversight, coordination or control that an effort of that size normally would. In the early days Boca ran largely outside normal reporting channels, and when they'd try to source tech or components from other parts of IBM, they sometimes had to clarify that they were in fact part of IBM.


From what I remember, there were two "Enter/Return" keys on IBM 3270 terminals. One was the regular "Return" key we have today, which just advanced to the next field, and didn't submit the form. There was also another "Enter" key where the Right Ctrl key is today, and that submitted the form. So, I presume, instead of being against Tab key, IBM might be against "Return" to be the form submit key as people who use 3270 would expect it to advance to the next field.

And that was true for many DOS programs. Pressing Return would just go to the next field, not submit the form. That was one of the things that took some getting used to on Windows.


Your memory is correct, and it's interesting to note that on the IBM terminal keyboards, the Enter key was marked "Enter", and the return/new line key was marked "↵". On the classic IBM PC keyboards such as the Model M, the Enter key is marked "↵ Enter". I believe IBM chose this to convey that the Enter key on the PC was both an "Enter" _and_ "Return" key in one. As you say though - individual applications got to choose what that meant in practice, leading to inconsistent behavior.

Funnily enough, IBM had already published this. CUA explicitly says tab and backtab move between fields.

So they spent seven layers of management escalating against their own standard: https://archive.org/details/ibmsj2703E/page/n13/mode/2up


Not being familiar with the term CUA, I looked it up and TIL something (https://en.wikipedia.org/wiki/IBM_Common_User_Access). Since the doc you linked is dated 1988 and CUA was a brand new thing circa OS/2 (at least in traditional IBM timescales), the apparent inconsistency might be down to the massive scale of IBM in those days and organizational propagation delay.

Another factor could be still-reverberating echoes of the likely political battles around something as broad and far-reaching as CUA. I can only imagine the quiet boardroom battles won and lost fighting over CUA between different factions across all of IBM's kingdoms, divisions and principalities.


Also conspicuously missing from the story is what key IBM DID want to use. I mean... that's the first question you'd ask!

Lame.


It looks like they wanted to use their existing special field-management keys (Field Advance and Field Backspace), with Tab being a different user experience. [0] The document does use the word "Tab". "Field Backspace" seems to duplicate the "Home" key's usage under some conditions.

To be fair, Microsoft and Bill Gates are bad at quality user experience. "Ctrl+F" differs across their applications.

[0] https://archive.org/details/bitsavers_ibm525xGA2onDisplaySys...

*Edited.

The more I think of it, the current Tab (Shift+Tab) key should of been used for entry navigation only, while the whitespace tab should of been something such as "Shift+Space".


"To be fair, Microsoft & Bill Gates are bad at quality user experience."

In some ways. Gates deserves never-ending enmity for plaguing us with backslashes in paths. But in others, Microsoft advanced the state of UI and UX more than anyone else in the '90s.

"Would of?"


> Microsoft advanced the state of UI and UX more than anyone else in the '90s.

There is no universe where that is true.


There is: this one.

Win95's UI was so incredibly influential that stuff introduced by it is still around to this day.


I don't really remember many Windows 95 firsts. One I remember is the ability to switch users without logging off. MacOS famously copied that (with a 3D cube look).

I think they made something really revolutionary around the IE3 era. Their News and Mail app was an Explorer extension that presented a folder full of mailbox folders and messages as an e-mail reader. You wouldn't see the extension, as the apps launched as applications, but that's what the implementation looked like from what I investigated back then.

Unfortunately, the idea was seemingly abandoned almost immediately. I would love to have such views on top of a user-space file system keeping messages, address books, and calendars in sync.


At my first ISP job, I eventually started using mh for mail. It was based on an awesome concept of sorting everything into directories and having procmail and various helpers to pre-process, including upon receipt and reading. I remember little of the details, but it was truly for the gung-ho neckbeard crowd, and it was well-suited for processing "large amounts" of mail (1993 style). I think MMDF was the MTA trying to do similar things in that vein. Meanwhile my boss was in love with PINE...

Of course, working at an ISP I could also telnet to our NNTP server and read Usenet on the local filesystem. Ugh.


I think that’s right, but it’s fair to point out that Windows 95 was (if you believe Steven Sinofsky, who should know) heavily influenced by NeXT!

https://hardcoresoftware.learningbyshipping.com/p/009-passwo...


The use of recessed surfaces for displaying information and the rectangular buttons was very NeXT-like, though more compact because it needed to work at VGA resolutions. Still, I don't think they managed to capture the essence of NeXT's framework, which is, impressively, still alive in every Mac sold.

I wonder how hard it would be to get NeXT source from the 1990's and compile it on macOS 26.


And the contemporaneous counterexamples are what? The various UNIX windows managers and X11? System 6-8 on the '90s Macs? None of those were great UI/UX IMO.

The big things I remember from Windows back then were contextual menus (Windows 95 vs MacOS 8), the Start menu, and Explorer (not sure why the Mac never developed one - apps were easier to find, I guess) with a folder tree on the left, which Finder lacked (but you could always have two windows with different views). In general, the user experience with Macs was smoother than with Windows, with the move to PowerPC being a huge improvement in performance over the 68040 models.

As pointed out elsewhere, NeXT broke a lot of new ground at that time, thanks in part to its Unix underpinnings. Also, Adobe brought great font management to both PCs and Macs before both embraced TrueType. NeXT had sub-pixel anti-aliasing from the start.


Not sure the case that the parent refers to, but there's a good reason that CTRL-F in the Win95/Exchange Mail client and Outlook will invoke the Forward email message command.

It goes back to what the common action is that the user would perform in the app. Forwarding an email is more common than finding text in an email - at least to Billg.

see https://devblogs.microsoft.com/oldnewthing/20140715-00/?p=50...


Ctrl-F6 ?

Shift Ctrl F6

I am trying to think of other older examples, but I know I have seen it before Workspaces/Chromebook:

The upstart adopts all the keyboard shortcuts of the dominant player. Then the existing userbase is comfortable... compatible, with the new and different software.

Then, slowly but surely, new novel shortcuts are introduced and you gradually find the “compatible” ones vanishing or conflicting or just glitched, until one day you’re no longer capable of using Apple or Word or Netscape or Excel.

I propose that IBM saw that coming, and fought against a future where IBM-trained users easily adopted someone else’s apps.


And this is a great example of why I read the comments on HN stories... thanks for the info!

RIP /.


VMS provided some functionality to convert floating point faults into traps: https://docs.vmssoftware.com/vsi-openvms-rtl-library-lib-man...


Sad to hear this; one of my first mechanical keyboards was a Filco TKL. At one point in time, it was my go-to "safe recommendation" for a keyboard. Since then, the Majestouch keyboards have only received incremental improvements, whereas the likes of Keychron have completely overtaken them on almost all criteria.


> But I wonder how it seems to people who understand how it works?

As someone who mostly understands what's going on - It does not seem like wizardry to me, but I am very impressed that the author figured out the long list of arcane details needed to make it work.


> The only potential bad news—and that heavily depends on your perspective—is that the new chips’ built-in NPU falls far short of the 40 TOPS that Microsoft requires for PCs to earn the Copilot+ PC label.

An interesting detail, given the ongoing rumours that the next major version of Windows will require an NPU with a certain amount of performance.


Quite likely - the buckling spring switches in Model M are quite stiff as far as keyboards go. Brown switches are a good choice if you want a light switch with some amount of tactility.


In addition to the three physical keyboard layouts in this post, there's a fourth one which has an extra key on both the right and the left side of the keyboard. An example is the Brazilian Portuguese layout Model M (pic: https://www.clickykeyboards.com/wp-content/uploads/2023/03/I...). The Apple test would be able to identify it, although I've never tried it in practice. I don't think the modern Brazilian Apple keyboard uses this arrangement.


> there's a fourth one which has an extra key on both the right and the left side of the keyboard. An example is the Brazilian Portuguese layout Model M (pic: ...)

That's the ABNT2 keyboard layout, which is the keyboard layout used here in Brazil. AFAIK, it's the only common keyboard layout with that characteristic.


The Thinkpad also likely cost far more than $600 when new. Even a several-year-old flagship laptop is going to be superior in some respects than a brand new laptop designed and produced to cost as little as possible.


Which VMS niceties does it offer?


Proper file locking, asynchronous operations across everything, ACL based security, proper ABI.

Not being an OS from C to C as the main programming model.

And then on top, multiple levels of sandboxing, including virtualization of drivers and kernel modules.

Ah and RDP is much nicer than X Windows or VNC.


Other than possibly the proper ABI, and yes, a tiny handful of file operations that could theoretically block and aren't available through io_uring (like ioctl and splice), Linux has the rest.


In security? Not really, unless you are doing immutable deployments with rootless containers and no shell access, which at the end of the day isn't UNIX any longer.

And which Linux exactly? Plus, unless you're doing C or C++, you most likely aren't using those APIs.

Anyway, the differences between bare-metal servers don't matter much in the days of cloud, where the actual nature of the kernel running alongside a type 1 hypervisor hardly matters to userspace.


In the UK and Ireland, a pint is 20 oz. (equivalent to just over 19 US ounces), so I always feel cheated by 16 oz. "pint" glasses in the US.


It is the same in Canada [1] yet I frequently see beer sold in "US pints" over here. I assume they do it so they can advertise cheaper prices (the amount being smaller). Some places will write the glass size in ounces, but some won't.

It is one of my pet peeves for sure.

[1]: https://ised-isde.canada.ca/site/measurement-canada/en/buyin...


Also Canadian. I don't often see "pint" on the menu, usually something like "16oz." Evidently restaurateurs and bar owners are wise to the law. Though I am pleased when I see "20oz" on the menu!

I kind of understand the logic of not serving 20oz and saying "pint". Customers might avoid a place because their "pints are more expensive", when in reality that place is also serving them 4oz of extra beer. A bit like the classic 1/3 lb cheeseburger being "smaller"[1].

Annoyingly, I do find that servers will often refer to their larger size beer as "pint" regardless of whatever the menu says.

[1]: https://en.wikipedia.org/wiki/Third-pound_burger#Marketing_f...


I have a buddy who used to call Weights and Measures on bars that passed off US customary pints as "pints". It is illegal, but enforced largely by complaint.


Also, bars in the US (probably due to lack of training, and customers too embarrassed to complain) tend not to fill the glass to the brim (so not even 16oz). I've seen 2-3 inch heads and asked them to top it up. They look at me as if I've just insulted George Washington.


Well, depending on the type of beer, that's intentional. It's not always the faux-pas that it would be to do this when serving cask ale in the UK.


But usually when that is the case they will use glassware that has a 20oz line on the glass with room for the head.


Your pubs kindly return the favor when we order whiskey. As Hunter S Thompson is reported to have quipped in a bar your side of the Atlantic: "What is this, a sample?"


That's fair, can't argue with that one.

Personally I'd have us use what the Royal Navy used to serve its rum ration in, the half-gill. This is 1/8 of a British pint or 71 millilitres, and the rum would have been a minimum of 54%!

Fractional gills were the pre-metric shot measure in the UK, but they were still pretty stingy. 1/6 gill in England, 1/5 or 1/4 gill in Scotland, and 1/4 gill in Northern Ireland.


A pint in the Netherlands is usually 500ml. In very rare cases, and only in real pubs (not mass-market "Irish" pubs), you get an actual pint. So you are cheated out of about ~68ml in that case; vs. the US pint you get a few ml more.


As far as I knew, Netherlands pubs typically sold:

- 200ml "fluitje" (little flute)

- 250ml "pintje" (little pint), often sold in a "vaasje" (vase, a tapered beer glass). This is the typical beer measure: https://nl.wikipedia.org/wiki/Pintje "Het bestelde glas pils heeft doorgaans een inhoud van 25 cl"

They also sell standard bottled beer in 300ml and standard cans in 330ml: https://nl.wikipedia.org/wiki/Standaardglas

I was not aware that 500ml was usual for the Netherlands. It is usual in, say, Germany, where they also sell the 1 litre Maß


The Maß is only a thing in Bavaria and strongly Bavarian-themed places, and almost nonexistent for bottles or cans anywhere in Germany. Faxe (which is Danish) sells one liter cans and some Czech brands sell or used to sell 1.5 liter plastic bottles - that's about it. The next common size is 5 liter mini kegs.


Do they actually call it a pint or just a half litre / large beer?

That's seems to be the norm in a lot of mainland Europe.


In France it’s 500mL and it’s actually called a pint


Same thing here in France. Except I've never seen any "real" pints here, it's always 50cl.


A standard US pint is about 473ml, so a US pint is ~95ml less than an imperial pint.


In the 19th century, at the same time they went stone-mad (and redefined the hundredweight as 8 stone or 112lb), the British redefined the pint as 20 oz.

After this point, there was nowhere the whole world 'round where a pint was a pound.

(The US standardized on the wine gallon, so a US pint is and was 1.04 lbs.)


It's not completely uncommon to be offered 16 oz or 20 oz as options in the US. But I see it more at "fast casual" restaurants than bars or more upscale restaurants.


And of course, what an "ounce" means may vary. According to Wikipedia "An imperial fluid ounce is defined in British law as exactly 28.4130625 millilitres, while a US customary fluid ounce is exactly 29.5735295625 mL, and a US food labelling fluid ounce is 30 mL."


The volume of UK and US fluid ounces being different also doesn't help.

The UK pint is 568ml, apparently a US pint is 473 ml.


This is why I get agitated when Americans claim to use imperial units. If they did, their pints would be the correct size.


Americans don't claim to use imperial weights and measures; they use customary weights and measures, which were also used in the UK prior to the creation of imperial units with the Weights and Measures Act 1824.


There are many people in America who do not know the difference, the mistake is fairly common.


At this point they are just American units, right? Since the UK has upgraded already.


The origin of US Customary units is British, even if the US, Liberia and Myanmar are the last countries still using it. The UK has almost entirely adopted metric (yards and miles are still used for measuring distances on roads and pints are still used for milk and beer, and the last government made the eccentric decision to permit pints for wine, which no producer used because they couldn't get the bottles), but these systems of units have identities beyond whether or not they're in use anywhere.

EDIT/CORRECTION: Milk is sold in multiples of 568 mL, so while the quantities are pints, the measurement is metric.


> EDIT/CORRECTION: Milk is sold in multiples of 568 mL, so while the quantities are pints, the measurement is metric.

What distinction do you intend to make by that? 1 pint is 568ml.

If you mean in labelling or something, no, they're marked 1/2/4 pints. Usually also with litre markings. You can also get metric sized bottles, i.e. on the supermarket shelf you'll often see one brand's 2 pint bottles next to another's slightly smaller 1l bottles.

The supermarket price labelling will be in £/litre, regardless of whether the bottle's pints or not, if that's what you mean?


Beer and cider are the only drinks that are legally not sold by metric volume in the UK. They have to be served by the pint, 2/3, 1/2 or 1/3. Every other drink has to use metric.


But that just means the quantity has to be expressed in metric units, possibly in addition to imperial, correct? E.g. I currently have a carton of milk in my fridge that’s labelled “2272ml 4 pints”.


Not for alcohol measures. Beer and cider have to be sold in pints, and there is a list of allowed sizes used for other drinks. Also the size of the standard measure used for spirits needs to be displayed on a sign at the bar.


Apologies, I was specifically replying to your last sentence, "Every other drink has to use metric."


Not really. The UK uses imperial units for most of the things you use units for in daily life (roads, cooking, drink sizes, body weight, utilities, land area...), even though they theoretically converted to metric. Canada is similar.


> The UK uses imperial units for most of the things you use units for in daily life (roads, cooking, drink sizes, body weight, utilities, land area...)

Not really. Old people might cook with funny old temperatures/measures and weigh themselves in stones, but it's fading out, contemporary cookbooks and gym culture are all metric. I've literally never seen a utility bill in anything other than metric (even if it's slightly weird metric like kWh or cubic metres of gas).


_Human_ body weight. I grew up measuring everything in kilos apart from people, which has, I guess, what amounts to its own wholly idiosyncratic scale: the stone, which no one I've since met outside of the UK has heard of.


I don't know why really, it's just 14lb, why does the US/Canada just stick with very large numbers of pounds instead of breaking it up as with others?

Kilograms seem more and more common for human weight too though, largely driven by fitness apps & communities I think. I doubt children in school today are accustomed to stone; only pounds and ounces for birth weight perhaps, but even that is metric medically and converted for the parents' familiarity these days I believe.


> _Human_ body weight.

Fraid not.

No medical professional in Blighty weighs people using imperial measurements. The only people who really use them are the elderly and (bizarrely) the type of crappy slimming magazine seen at supermarket checkouts... the kind satirised by Viz as "Less Cake, More Exercise".


That's why we call it the US Customary System.


The (incorrect) claim is indeed made in every single metric vs "imperial" comments section I've come across.


Many Americans do claim to use imperial units. They’re wrong, but they do claim it.


surely if that was the claim George Washington would never have had his dream

https://www.youtube.com/watch?v=JYqfVE-fykk

