There’s no shell in the image, and there are no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data... (and the fact that the firmware is stored in RAM rather than ROM)...
But there could be. In fact, it could do almost anything, as it's a powerful little computer in its own right, and it sounds an awful lot like it boots the kernel and then runs a daemon in a most familiar fashion. I can think of a few delightful applications offhand. There will be some epic hacks. Is there like a "Bunnie" signal?
A little thought experiment... some clever dudes manage to reverse engineer enough to "jailbreak" this, and then they put a little OS image on it and start hacking away... then a bunch of people say "hey, this is really cool! It's like Apple made a little raspberry pi for us!" but then even more say "but it's so limited, it doesn't have USB out, etc, etc."
And then dozens more just go on and on about how Apple "crippled" the device by not giving it USB and how this "proves" Apple just wants "control" and why did they have to jailbreak it anyway?
In the process of getting their panties in a bunch they never realize they're bitching about an adapter not being a general computing platform.
They're also proving Apple right-- it's engineered to solve a specific problem and provide specific functionality. Even if it were jailbroken from the factory, people would be complaining and demanding that it does other things... than what it was designed to do.
You're extrapolating everything into a very silly future-tense strawman that won't ever exist. Nobody will be complaining that Apple didn't put USB in a proprietary SoC intended for video transfer.
"And then dozens more just go on and on about how Apple "crippled""
Exactly. "Dozens". Apple isn't building products for "dozens". They are building for the people who literally stream into the Apple retail store (non techies who clamor for their products) at all times of the day when other stores sit idle. I can walk into the Apple store near me (and it's not in Manhattan but in a suburban mall) and while the other upscale shops are idle, say, Tuesday at 10am the Apple store is quite busy.
>It's a strange day when we move from cables that work perfectly well to cables that need to be software upgradeable to work as well.
Yes, it's a strange day when we move from "cables that work perfectly well BUT the device with the ports they connect to has to be replaced whenever a new technology comes along" to "cables that need to be software upgradable but are far more future proof and capable, all the while freeing the device from those concerns".
The stereo beside me has components that span 30 years and uses the same cabling for all components.
I realise someone will come back with "Oh, but DRM, encryption, needing more information about the source and destination"... but I'd counter by pointing out that telecoms cabling works perfectly well for transporting all kinds of things across it. You can put into protocols the things that you need, without having to create hardware problems that require constantly replaced and software upgradeable cabling.
>The stereo beside me has components that span 30 years and uses the same cabling for all components.
Yes. And it does just one thing. Output analog electrically encoded sound signals.
Now, call me again when that stereo cabling has to also support: audio out, digital audio out, video out, different resolutions, MIDI out, connecting to medical and scientific devices for control, charging, data backup and sync, ethernet, etc etc...
>I realise someone will come back with "Oh, but DRM, encryption, needing more information about the source and destination"... but I'd counter by pointing out that telecoms cabling works perfectly well for transporting all kinds of things across it.
Only because there is a computer on the other end (either an actual PC or a TV, etc.) that knows what to expect and how to decode it and show it.
In this case, the computer is inside the cable, so the connected devices don't have to know everything.
The real issue here though is it's a solution to a non-existent problem. This was potentially a problem before HDMI was a standard. That's no longer true though. Modern HDMI does video, audio and has an ethernet bus standard right on the cable.
All Apple is doing is proprietary lock-in here - it's not more flexible in any way, whereas HDMI is signal-compatible with DVI and DisplayPort.
_Modern_ HDMI supports ARC, 3D and Ethernet. The early versions didn't, and you have to upgrade components to get the functionality. Apple is solving a real problem here: next time the HDMI consortium adds another random feature to their bus, Apple can just ship a new adapter that will work with all Lightning devices since none of the HDMI hardware is in the phone.
But you have to buy new HDMI cables every time the standard is upgraded. Right now the Ethernet channel is only 100 Mbps, and it only supports 4K at 24fps; when those are improved, that means a new cable. HDMI can also have latency issues; if you want to play something like Rocksmith you need to use an analog output to circumvent it.
The amount of engineering that goes into making devices work over the huge installed base of cat5/cat5e/cat6 twisted pair is pretty amazing; I was amazed Gig-E worked, let alone 10GE.
If there weren't the huge installed base of wires in the walls of buildings, we probably would have different standards for patch cables instead of fairly obscenely complex network interfaces.
all use Ethernet, and standards are set for them: all peripherals need to talk to it in the same protocol, and newer devices should be backward compatible.
But with Lightning you can design cables which need not use device computing power for all transformations. It is pretty smart; they built this cable for the next decade, and we will see interesting applications from Apple and its partners in coming years.
The protocol certainly is not the same and not exactly backward compatible. 1000BASE-T is a fairly complex interface that is more similar to S(H)DSL than to traditional Ethernet over twisted pair, or even to anything a sane person would call a "baseband interface". By the way, this is the reason that SFP/GBIC-to-1000BASE-T transceivers are not supported by all devices with the relevant slot, are not exactly compatible between manufacturers, and sometimes are not compatible with non-gigabit devices on the other end of the link (i.e. not backward compatible on the RJ45 side of things).
And before that, in the days of the first Fast Ethernet implementations, there were also incompatibilities of a similar type, which is mostly the reason that every managed or even "smart" switch lets you disable auto-negotiation quite prominently in its configuration interface. Although that was more about software issues than about an interoperable implementation being impractical, as is to some extent the case with copper SFP modules.
Bottom line: "I don't know what the next network interface will look like, but it will be called Ethernet and use RJ45 connectors on CatN (for some value of N) twisted pair ..."
I find your example lacking in perspective. In telecom, even under TDM standards like SDH, SONET, etc., there were a plethora of incompatible connectors. You can have RJ45, RJ12, BNC (literally dozens of variations). In optics the same is true as well with SC, ST, LC, FC, etc. The last decade's shift towards Ethernet everywhere is a result of "mass" consumer desire (i.e. you guys on HN building things that use Ethernet, a protocol originally designed for a limited number of workstations in a small environment) and that has put the RJ45 at the top for a lot of devices. Recall that 15 years ago AS400, ATM, FDDI all had their own connectors and standards. All of this (and much more) just to send some bits over a physical connection. Just crack open a Grays catalog and go to town.
For the record, each of these has a purpose and reason. Some were a function of the materials present at the time, others due to specific environmental concerns. I would not class any in the realm of "lock-in". Due to the capital intensive nature of the industry they each were good business decisions at the time. It wasn't as easy as buying a $40 plug.
Which actually doesn't make any sense. Sure, copper and cable have been around for thirty years, but as Google Fiber, Verizon FiOS, and AT&T U-verse show, it's often required to install new cabling and other hardware. This is true of phone (DSL upgrades) and cable internet also. I used to work as a subcontractor for Time Warner's Road Runner internet service when it first arrived in Northeast Ohio. Part of the common work was installing new routers and switches after the new cable lines were installed in the individual neighborhoods.
It's not clear that it can support "absolutely ANYTHING" for values of anything that are any greater than can be achieved with USB 2.0 OTG though. So far it hasn't.
His first sentence is "Airplay is not involved", and then he proceeds to tell you that all of the parts of Airplay that are responsible for the drop in quality are involved. Specifically, a compressed video stream is created and that is sent to the SoC for decompression. That compression/decompression cycle is responsible for the quality degradation in the video, and (presumably) the limitations on the output resolution.
"We didn't do this to screw the customer". Sure you did. You have offloaded the cost of a parallel connection, necessary for HDMI, to the adapter. Almost every competing system comes with HDMI built-in. 30 pin connectors could drive HDMI. Lightning can't without having another whole computer in the adapter.
What makes this actively hostile to the customer is Apple's proprietary adapter design. If there was a real, competitive, standardized market for the various adapters, on balance it would be a good deal. But there isn't.
MagSafe is nice, but if you have any of a number of common problems with your Air's power supply/cable, you are in for an $80 charge to buy a new one. Even if the only thing wrong with it is a pin spring that's worth pennies.
Because Lightning is coupled to proprietary decoders and Apple patents, it's customer-hostile.
And, as we've seen here, the result has poor quality. If the quality was high, then we might be able to overlook it. We shouldn't.
> "We didn't do this to screw the customer". Sure you did. You have offloaded the cost of a parallel connection, necessary for HDMI, to the adapter. Almost every competing system comes with HDMI built-in. 30 pin connectors could drive HDMI. Lightning can't without having another whole computer in the adapter.
So every competing product charges every single user, most of whom will never use it, for HDMI output. Apple charges (more, to be fair) only those users who need the feature.
How this gets interpreted as "screwing the customer" baffles me.
You are easily baffled, then. Apple doesn't just change the connector design. It also performs authentication/identification/verification on the objects it is connecting to.
If you're Apple and you want to make lots of money from your very popular products, one of the ways you can do that is to make them incompatible with the competitive markets for standardized accessories. You can come up with your own connectors, and you can use patents, litigation, and trade secrets to ensure that you have no competition, and that your users have no choice but to buy high profit accessories from a single source.
If I'm Apple, I don't need to come up with some kind of trick to make lots of money. I just need to keep releasing best-in-class hardware at competitive prices by leaving out expensive features 99% of my customers will never want.
I love my iPhone 5, not because of patents or lawsuits, but because it's really, really well built. I definitely couldn't have gotten it cheaper from anyone else. How many overpriced accessories have I bought? Zero. Because I don't need that crap-- and Apple doesn't make me pay for stuff I don't need.[0]
Yes, yes, of course you are right. Except for the hardware video codecs that are necessary to make it work. A protocol requires a combination of software and hardware that can execute its specification.
He doesn't say, but probably because they just don't have the bandwidth.
And that's where all this future-proof talk falls on deaf ears. New interfaces will only have more bandwidth; that's the whole point. Meanwhile, Lightning still has the same bandwidth, and there's still a considerable penalty in serializing any other interface.
If they can't even do 1080p lossless right now, they are in much deeper trouble for the future.
> If they can't even do 1080p lossless right now, they are in much deeper trouble for the future.
Let me try and shed some light on this mystery. Consider this rumor. Also, I am not an electrical engineer, so I may be talking out of my ass.
As best as I can tell, Lightning is not (yet, anyway) a real protocol like Firewire, USB, etc. What it is so far is USB with a different connector and some negotiating chips.
So when you connect your Lightning-to-USB cable to your iPhone, your iPhone squawks and says "What is this?" and the cable says "I'm USB". The iPhone then sets its pinout to USB mode and off you go.
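A toy sketch of what that handshake could look like, purely as illustration; the mode names and message format here are invented, since the real Lightning protocol is undocumented:

    # Hypothetical identify-then-configure handshake (invented names).
    MODES = {
        "usb2":   "route the data pins as USB 2.0 D+/D-",
        "serial": "route the data pins as slow TTL serial",
        "av":     "stream compressed video to the adapter's SoC",
    }

    def negotiate(accessory_answer):
        """Device asks 'what are you?', accessory answers, device sets
        its pin mux accordingly. Refuses unknown accessories."""
        mode = accessory_answer.get("mode")
        if mode not in MODES:
            raise ValueError("unrecognised accessory; refusing to drive pins")
        print("configuring pins:", MODES[mode])
        return mode

    negotiate({"mode": "usb2"})   # a plain sync cable would answer like this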
What's so great about this, why not just use USB? I think the secret sauce is that Apple wants to surreptitiously invent new pinouts with faster data rates, without consulting any standards bodies and having to rally industry support on TVs, computers, et al. They just build some new controller to push Lightning at 2Gbps+, put it in new iPhones, and build active cables that spit out HDMI or UHDTV or USB3 or whatever it is the kids are doing these days. The active cables degrade gracefully for hardware running at the slower data rate.
What I think you are seeing right now, is the dry run with the off-the-shelf controller. Get manufacturing ramped up. You need this to plug your phone into a computer anyway. But it's just the tick. Wait for the tock.
Now the only thing that puzzles me is why they settled on 8 pins, when USB3 is 9. Obviously they wanted an even number so you could plug it in every which way, but you would think stepping up to 10 pins would let them use off-the-shelf USB3 controllers instead of USB2. Maybe they can do USB3 on 8-pins somehow, or maybe the tock will be ready fast enough that it's not worth it.
Yep, and the comment about them using already-existing h264 support in hardware I think supports the dry-run, fastest-to-deadline theory.
Re: pins, I'm not an EE either, but I don't get why people are so concerned about the number of pins. As I asked earlier in this thread, is there a relation between the number of pins and bandwidth (well, with physics involved there is _some_)?
Still not an EE, but as I understand it, if you are creating your own controllers and you also control how the cable is shielded, pin count doesn't matter much except for power.
Traditionally it's been easier for most hardware makers to double the pin counts than to build controllers that run at twice the clock rate and also make everybody use fancy cables. But traditionally not everybody has PA Semi across the hall to build chips for you and years of experience selling $50 cables to consumers via direct retail. So I'm betting that the usual economics of the consumer data interface market don't apply to Apple.
Has this been true since the late 80's/ early 90's when serial ports left parallel ports behind?
With serial techniques such as differential encoding, parallel transmission off the circuit board has been obsolete for a long time. The potential skew between the pins is too great and synchronisation too complex.
There is a pretty significant relation between the number of pins and the price of almost every piece of hardware involved. IC package price is almost linearly dependent on the number of its pins, and for most mass-manufactured chips packaging is a major component of the final price. Also, wide parallel interfaces are harder to design from an interference perspective and to route on PCBs. The general shift to essentially serial interfaces for almost everything except main memory is motivated by the simple fact that trading silicon area used by serialization/deserialization logic for fewer device pins, simpler boards and simpler system design is very worthwhile whenever it's at least partially possible. For very slow interfaces it has been this way for at least 30 years, since the mid 90's for slow system buses (that's why around 2000 motherboards stopped having ISA slots; ISA was replaced by the serial LPC interface around that time) and since the early 00's for almost everything in non-specialist applications.
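A back-of-the-envelope illustration of that trade-off; the clock rates below are placeholders, not real part specs:

    # Same aggregate bandwidth two ways: many slow pins vs. one fast pair.
    parallel_pins = 16
    per_pin_clock = 100e6                       # 100 MHz per pin (assumed)
    parallel_bw = parallel_pins * per_pin_clock # 1.6 Gbit/s across 16 pins

    serial_pins = 2                             # one differential pair
    lane_rate = 1.6e9                           # 1.6 Gbit/s on that pair (assumed)

    print(parallel_bw == lane_rate)             # True: the SerDes logic on-chip
                                                # buys you 14 fewer package pins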
Why wouldn't they use H.264 and their existing hardware?
Do you really think it makes more sense to decode H.264 on the device, then re-encode it in another format and then decode that on the adapter? Sounds slow, illogical and bound to introduce more artifacts.
You'd need 3Gbps to push uncompressed 1080p data. If you look at USB 3.0 and make a not-unreasonable bet that Lightning is slower, then you simply don't have the bandwidth to do so.
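The arithmetic behind that figure, assuming 24-bit colour and 60 fps and ignoring blanking intervals:

    width, height = 1920, 1080
    bits_per_pixel = 24          # 8 bits each for R, G, B
    fps = 60
    raw_bps = width * height * bits_per_pixel * fps
    print(raw_bps / 1e9)         # ~2.99 Gbit/s of raw pixel data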
As well as bandwidth, power consumption. Chucking 3Gbps down a long wire is going to reduce battery life for a mobile product. Perhaps the controller can support it, but heat dissipation in the phone is significant?
Actually, that doesn't make sense - the adapter is powered from the iPhone or iPad and still has to chuck 3 Gbps down a long wire, except now you're using the iPhone's h.264 encode hardware and an entire ARM SoC in the adapter which you have to power too.
Retina iPads already need to re-scale the image since they're higher resolution than an HDTV. Why force the little adapter to do (some pretty massive) scaling and encoding when the iPad has a very nice GPU?
It seems to me they had a system for outputting video via h.264, and they decided to use it again here. Seems like a reasonable decision at that point.
Well, there was a big whoop about the thinness of the connector when they announced it being necessary for very thin devices. Can we manufacture non-brittle connectors with center pins that are 1.5mm thick?
If you count them there are actually 16 pins. Apple wants you to think there are only 8 pins and that the connector can be reversed. But nothing prevents them from adding some additional handshaking in a future device that lets them use all 16 pins separately. The connector could still be reversible with additional switching circuitry that routes signals to the correct place depending on how you have rotated the connector.
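A toy sketch of the kind of orientation-dependent routing being described; the signal names are invented, not Apple's actual pinout:

    # Reversible plug: a mux maps physical contacts to logical signals
    # based on the detected orientation (signal names made up).
    SIGNALS = ["PWR", "GND", "L0+", "L0-", "L1+", "L1-", "ID", "ACC"]

    def route(orientation):
        """Return a physical-contact -> logical-signal mapping."""
        order = SIGNALS if orientation == "face_up" else SIGNALS[::-1]
        return dict(enumerate(order))

    assert route("face_up")[0] == "PWR"
    assert route("face_down")[7] == "PWR"   # same signal, mirrored contact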
And HDMI does it effortlessly. That's exactly what these interfaces were designed to do: raw massive data with low latency over short distances, suitable for processing in dedicated circuits, not SOCs.
> suitable for processing in dedicated circuits, not SOCs.
You're splitting hairs. Pretty much all HDMI transmitter ICs have a microcontroller on board; they need a processor to deal with the protocol configuration, DDC and HDCP. Probably most are using some 8051 design plunked into the chip. They are SoCs by any reasonable definition.
Yes, of course. But they have the actual transfer of the raw data offloaded onto specific circuitry that mostly just passes it through, something the processor part of a SoC would be terrible at. That's not what Apple is doing; they take in a normal lossy compressed video stream (H.264, who knows) and then reconstruct an HDMI signal.
The only difference between what you've described and what is going on in this adaptor is the inclusion of h264 decoding. The use of this is an argument that can be had (personally I think it was a great play on Apple's part, see my comment here: https://hackernews.hn/item?id=5308345)
The rest of this is ancillary stuff. They aren't decoding h.264 on an ARM core, that's impossible for any significant bit rate, they are doing it with a purpose built bit of hardware, just as the HDMI encoding is done in a purpose built bit of hardware. Incidentally the encoding of HDMI is a mess, the spec is worth a read some time.
But then there's 4K, if we're talking about future proofing.
Well, say right now it cannot transmit 1080p60 raw. Could it with new hardware in device and adapter?
Or put another way — does number of pins limit bandwidth in any way?
> Well, say right now it cannot transmit 1080p60 raw. Could it with new hardware in device and adapter? Or put another way — does number of pins limit bandwidth in any way?
Based on what has come out about Lightning, it appears to have 2 differential data channels (same as USB 2). There will be some upper limit on frequency, but it's impossible to know without detailed specs. It's also not clear if one is locked as send and one as receive, or if they are configurable; there is an ancillary control channel, so anything seems possible.
For comparison, HDMI has 3 data channels, each with raw bit rate of up to 3.4Gbit/sec (~2.7Gbit/sec data).
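For reference, that ~2.7 figure is just the 8b/10b coding overhead of TMDS (10 bits on the wire per 8 data bits):

    channels = 3
    raw_per_channel = 3.4e9                      # bit/s on the wire (HDMI 1.4 max)
    data_per_channel = raw_per_channel * 8 / 10  # strip the 8b/10b overhead
    print(data_per_channel / 1e9)                # ~2.72 Gbit/s per channel
    print(channels * data_per_channel / 1e9)     # ~8.16 Gbit/s total payload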
So, you're saying bandwidth increase while maintaining backwards compatibility won't be possible through the Lightning interface? Surely, USB 1.0 to USB 2.0 demonstrated a counter-example to your statement.
Yes, and? Lightning 2.0. Sure. That doesn't mean the physical interface changes. That doesn't mean there isn't backwards compatibility. That doesn't mean there isn't forwards compatibility. "And thats where all this future proof talk goes deaf" argument made no statements to support itself. "Meanwhile, Lightning still has the same bandwidth" is like saying "Meanwhile, USB still has the same bandwidth with USB 2.0 as it did with USB 1.0". There will be higher bandwidth Lightning 2.0 peripherals with Lightning 2.0 devices; but Lightning 1.0 peripherals will still be compatible. The Lightning interface doesn't establish some sort of permanent bandwidth cap to its future backwards-compatible revisions.
Do you mean the future where we're streaming 4k video? If HEVC (aka H.265) lives up to its promise, 4k video will come in around a mere 50% more than 1080p does now.
Meanwhile, lossless 1080p would require 20000% more space (no I didn't add too many zeros, that's 200 times more).
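Rough numbers behind the 200x claim, taking ~15 Mbit/s as a typical 1080p H.264 stream (that stream bitrate is an assumption):

    raw_1080p60 = 1920 * 1080 * 24 * 60   # ~2.99e9 bit/s uncompressed
    typical_h264 = 15e6                   # ~15 Mbit/s compressed (assumed)
    print(raw_1080p60 / typical_h264)     # ~199x, i.e. roughly 200 times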
This isn't streaming, it's the last 50 cm from your computer to your display. 200 times more data is already handled without breaking a sweat by today's interfaces. Dual-link DVI, from 1999, can already do more than 1080p.
Apple are trying to handle all technologies without custom hardware in the device. Effectively, they have turned it into a streaming scenario. It's not as weird as you'd think -- there are USB3 video adapters that work the same way.
DVI, HDMI and MHL all require custom hardware in the device. As the Apple engineer said, Apple are trying to avoid this.
Taking one master protocol (DisplayPort) and converting appropriately would make a hell of a lot more sense than starting with your own proprietary format and putting compressed video decoders in your cables.
Avoiding custom hardware in devices just seems ridiculous - to support new standards Apple are either going to have to update their chips or update their devices, and they will have to update the cables too. There's not a saving here that isn't achieved by simply using an existing standard.
Sorry, but are you talking about the right connector here? Lightning is for the iPod, iPhone and iPad.
Of course dual-link DVI et al. can do more than 1080p. They are all huge. Apple's connector needs to support not just today's thin devices but those for the next decade (iWatch?).
Because MHL isn't USB. It uses the same plug but runs its own HDMI hardware connection over the pins -- exactly what the Apple engineer said they were trying to avoid doing for every single hardware setup.
There's every indication that Lightning can handle much more than this, but in order to get the hardware running quickly, they've reused settings that were previously used for WiFi data rates.
All we can guess is that Lightning can handle somewhere between 10Mbps (AirPlay over WiFi) and 2Gbps (native HDMI rates). We have absolutely no idea where in that range its actual capabilities lie.
I'm guessing its actual capabilities are basically just USB 2.0. Someone tore apart a Lightning-to-USB sync cable ages ago and the data pins are apparently wired straight through - so we know that Lightning can speak it natively - and given all the focus on simplicity and not doing multiplexing at the expense of more complexity in the adapter I can't see any reason why Apple would develop their own protocol.
That means that the CPU reconfigures the pins into USB 2 mode, which means that Lightning can at least do 480 Mbps. This is because any SoC nowadays can do a low-level USB-device pinout, so it's reasonable to simply pass it through the Lightning connector, unlike HDMI or DVI which require specific video encoders.
It says nothing about Lightning's maximum speed. Besides, whatever maximum speed can be measured with today's hardware, that doesn't mean more can't be pushed tomorrow with different hardware in the iDevice.
Given there's nothing that can currently use more Lightning bandwidth than USB 2.0, and apparently 3rd-party manufacturers don't have access to anything except USB 2.0 and slow TTL serial, it'd be surprising if it did support anything else. Also quite expensive - Apple's hardware USB implementation is almost certainly third-party IP they've bought in and dropped into place unamended, modifying it to multiplex another protocol they don't need is a waste of money.
Besides, if we're talking about hypothetical future hardware, there's nothing to stop someone doing the same with micro-USB. (In fact, manufacturers already have in the form of MHL.)
I might be really confused, but why is Apple going to be in deeper trouble?
Surely the future will be H.265 content, which will have better quality for the same bandwidth. And it's unlikely that people are going to be demanding a higher output resolution than 1080p from their mobile device anytime soon.
Again, this is a connector from your mobile device to a display. Nobody has ever done any lossy compression on that pathway; both because you can drive simple circuits at frequencies high enough to have plenty of bandwidth and because it would actively destroy information (hence lossy). This is such a big problem because the video you are playing has already been lossy compressed (H.265) and it will most certainly not get better by a second pass that only has the raw pixel data available.
I could imagine people want to play 4k movies from their mobile device on their TV, but you wouldn't install a 4k screen as the display on the mobile device.
Edit: Yes, computer is inaccurate. I wanted to get the idea across that we are talking about connectors (DVI, HDMI) that you would normally use to hook up a computer or laptop to a TV or LCD. They have only recently appeared on mobile devices, but serve the same purpose here: video (and audio, for HDMI) out.
I think that if you have a 1080p source on your phone, it goes to the TV as 1080p without another compression step; it's only mirroring that reduces the resolution (which, for the iPad mini, is less than 1080p anyway) and introduces a compression pass.
a) This is a connector from a mobile device to a TV. That is the primary use case here. I doubt anybody is hooking up an iPhone to any other type of display.
b) Why is there a second compression/decompression stage? Isn't iOS outputting the compressed H.264 stream and the adapter decoding it, i.e. one stage?
c) Given that Retina displays by definition are the best resolution we will need, and that is far less than 4K, it is questionable whether there will be a use for 4K on mobile devices, other than using your iOS device as a media player for your TV (a very small use case).
I think the main takeaway is, why pay for a $50 widget when a $5 cable will do the job?
For the consumer who just wants to connect their device to an HDTV, it's crazy they have to pay 10x more just because they're in the Apple ecosystem and not Android.
As for being future-proof, by the time some cool new A/V interface hits the market, and we've all updated our TVs, the Lightning devices of today will be in a museum.
Somebody has to pay the piper-- the hardware to transmit HDMI has to exist somewhere. With a smart adapter, only the people who need the feature have to pay for it.
Seriously? All the arguments about Apple products "just working" and being "premium products" now change to "only people who need the feature have to pay for it"?
"Now"? The iMac G3 didn't have a floppy but I'm sure you could buy an external floppy drive. Macs after that rarely had a full VGA or DVI port, people had to buy adapters. Starting with the MacBook Air, we had to buy external optical drives, external ethernet adapters, external FireWire adapters... This is not news. And I'm saying that as someone who hates Lightning with all their heart :)
But given economy of scale, they (per person) will pay more for it than the amount they would have paid if everybody bought that hardware in their phone.
And economy of scale does work. I do not think Apple can seriously undercut Samsung because they managed to leave out a HDMI chip.
I think Apple's reason for this connector is more one of aesthetics: why have X > 1 chips that can communicate with the outside world and, typically, X connectors? Full wireless is not yet an option, so they need one. Then, they need some way to figure out what is on the other end of the line.
Why they didn't pick USB3, I don't know. Not proprietary enough? Connectors too bulky? Low power spec not (yet) available? Not flexible enough? Supports too many devices? (If you put an USB connector in, people will expect that it works with their hard disk, photo camera, keyboard, mouse, etc)
You know you can buy Lightning cables off eBay for $5.
And that the original dock connector first appeared in the iPod 3G nearly a decade ago. So yes today's devices will be in a museum. But the connector could still be in use in 2023 and beyond. Worth keeping in mind.
Yes, you can get a Lightning to USB cable for charging and syncing for $5, but to connect to a TV you need to get the Lightning AV adapter which costs $50.
The interesting point to me is that a device can be future proof through iOS updates - i.e. for as long as apple is prepared to create software drivers/updates to support the peripheral. Obsolescence is now almost entirely a software rather than hardware problem. Features can be added or removed via updates.
Firmware loaded at runtime is not new (see many of the wifi/bluetooth dongles that don't work in linux).
However, would apple let 3rd party peripherals download driver code? I wouldn't have thought they want to keep baking it into iOS itself as the number of peripherals increase. Perhaps drivers embedded in a controller app via the app store?
I like the idea that with a thunderbolt<->lightning adapter (and a huge amount of hacking) iOS peripherals are a blank canvas and could be used with other non apple devices or for purposes never originally intended (that VGA card as a software radio transmitter springs to mind...)
It all seems like a fairly logical progression from the company that brought us FireWire. If you don't like it, instead of whining, vote with your wallet.
Personally I believe this is a smart move and we'll see far more exciting things running on this interface in future. Plus myriad related patents.
I think we can all agree that pushing hdmi circuitry off the iPhone reduces cost of the device, increases cost of the dongle, and drops signal quality. We can also speculate that signal quality will increase over time by a combination of better encoding, pass through for already encoded signal, and raw speed upgrade for lightning protocol itself.
What is puzzling is the timing of the release. There is an obvious drop in quality, and no obvious reason to save cost in iPhone 5. They could have let the technology mature to quality parity and only then release it, so what gives? My theory is that Apple came under serious price pressure, as cheap smartphones are now the fastest growing segment of the market, and so they are preparing to ship a very cheap version of the iPhone. Given their position as high-margin company on one hand and pressure from low-margin competition on another, they felt the need to pinch every penny. And so the iPhone 5 ended up being the test bed for the new wave technology with much lower cost of entry, but more expensive accessories. Unfortunately, what we get in the interim are both expensive devices and expensive peripherals.
This takes a page from Intel's Thunderbolt, which uses active cables too. Initially Thunderbolt cables were costly for this same reason, but as chips become cheaper and smaller, cost would not be an issue while you still have the advantage of an adaptive interface.
Lightning is an Intel feature that Apple is trying to support, right? I mean, I can't tell if they firmly support this or if, since Intel is their only chip supplier, they are kind of bound to supporting this.
No, Lightning is a proprietary Apple connector technology which looks like it borrows the same concept from Thunderbolt, which is Intel's next-generation connector.
It still doesn't make any sense. "Here's a world standard which is being built into literally every display on the planet, and is signal compatible with many others" (i.e. HDMI/Displayport/DVI).
And for some reason they're worried about some new standard being incompatible and not being able to upconvert?
The exact same logic would apply if the devices had native display connectors, and then eventually needed some new connector for it. If you can't get a raw framebuffer or HDMI over your current link, you're never going to get a new faster/bigger protocol over it either.
This isn't just about video; the design helps to keep the pin count smaller and removes the need to have a dedicated combination of pins to support a certain standard which may not be used anymore. For example, the old 30-pin connector had support for FireWire, which is no longer used, but the connector still needed to have those pins.
I think they know about the issue of video quality and are working to fix it. The beauty of this design is they can improve the hardware/software to push data efficiently, because the device just needs to output a data stream; it's up to the cable to handle it. As Moore's law kicks in, the chips the cables need will get smaller.
They no longer have to keep changing the pin configuration to add support for newer connectors. The cables will handle that in software.
Except all it does is create a new problem for them, without solving an old one. If you're putting a SOC in the cable then it has to be powerful enough to talk to the new standard. So you need a custom cable and connector no matter what.
If you need a custom cable no matter what, then why not support a common standard - which is a standard and thus common to many devices - and then adapt it as necessary?
Instead you have this situation: you've got a proprietary standard which can't handle a common standard well in the first place (evidenced by the fact they made a compromise to make it work). It'd make sense if there was a lack of pins or something, but it doesn't - between USB and HDMI/DP there's every type of signalling you would need to support newer standards within the realms of the existing hardware.
It's Apple ecosystem lock-in and that's it, but in this case it's a worse outcome.
They're not going to be reprogramming iOS devices for faster bitrates - that means an IC change in the device. It also means an IC change in the cable. And at the end of the day it still means... less capability than using standards would have.
>>They're not going to be reprogramming iOS devices for faster bitrates - that means an IC change in the device. It also means an IC change in the cable. And at the end of the day it still means... less capability than using standards would have.
I am not sure the assumption that it would need a new chip is true. It may be a firmware issue that can be fixed in a software update.
>> It'd make sense if there was a lack of pins or something, but it doesn't - between USB and HDMI/DP there's every type of signalling you would need to support newer standards within the realms of the existing hardware.
Within the realm of existing hardware is the key point. The 30-pin connector was used for 10 years. In those 10 years there has been huge change in the standards used and adopted. I would assume Apple has similar plans for Lightning.
Also, USB and HDMI are not perfect standards, as you seem to imply. The docking port on iDevices is used for many other things, including things which haven't been invented yet.
Standards like Micro USB 2 are non-starters for Apple. With only 5 pins: +5V, ground, 2 digital data pins, and a sense pin, most of the dock connector functions wouldn't work; only charging and syncing would. Micro USB 3 is capable but larger than Lightning. Also, implementing Micro USB 3 would require a chip on the host and handling the USB protocol on the processor, using precious PCB space. And implementing HDMI over Micro USB needs a special converter chip.
Also, many devices break the USB spec for allowed power to charge their devices. The iPad with Retina display, despite going over the limits, takes forever to charge. On Lightning, the device can multiplex all 8 pins to charge the device.
I agree with your point of controlling the peripheral market.
This isn't as much of a compromise as what would need to be done in the case of Micro USB. MHL does the same thing for Micro USB, where a separate controller chip on the cable repurposes the signals for HDMI while USB is off.
Ideology has trumped engineering, and as hackers, you shouldn't tolerate it.
Frankly, all of this has been obvious all along to any competent engineer, since the moment Apple introduced Lightning. They described it as a serial bus and talked about how it gave more flexibility. If you think about it for 2 seconds it's obviously better to run protocols over a serial bus than to run 30 lines of individual signals, with dedicated lines for analog and dedicated lines for digital, in a world where people want HDMI adapters for a connector that originally had FireWire signals on it, from a time before HDMI was even common.
But this is Apple, so the REAL reality distortion field kicked in-- a tsunami of press acting as if Apple was ripping people off with a $30 adapter, hundreds of mindless conspiracy theories from Apple bashers on Hacker News about how this is to have more control over people and how this once again proves that "open" (defined as google, not actually anything to do with openness) is better than "closed" (defined as Apple, you know the company with the most popular open source operating system in the world?).
It's one thing to not know enough engineering for this to have been obvious to you, it's quite another to spread lies and engineering ignorance as a result of your ideological hatred of Apple. And the latter is basically all I saw on HN about this format. (Which is one of the reasons I write HN off as worthless for months at a time.)
What you're saying is true from an engineering standpoint (serial vs parallel), but has to be placed in the customer's context.
In this specific case the quality is bad, operation is unreliable, and the price is high. Consumer devices accept HDMI as input. Serial to parallel video (Lightning to HDMI) is tough without some heavy-duty hardware -- hence the exorbitant cost of these adapters.
The SoC design introduces a massive amount of complexity. This has yielded unreliable operation. And it introduces that complexity at a point of physical vulnerability -- people don't treat adapters like tiny fragile computers. They treat them like, well, adapters.
End-to-end serial communications would be nice, but that's not the world we live in.
Lightning isn't that much smaller than HDMI or Micro-HDMI. Reversibility is a very minor feature, and not worth the price being paid.
And that's not a $30 adapter. It's a $50 adapter. Did you think it was $30? That was the old one -- parallel to parallel.
After thinking a little bit about it, I think this approach does make some sense and allows for more flexibility in the future. Keep in mind Lightning was likely designed to last well over a decade and will be used in many different devices.
Now, since the adapter is a SoC and its OS is booted from the device, what that means is every device has essentially full control over how it wants to output HDMI, without having to change the adapter or the port. Right now this is accomplished using this quirky h.264 encode/decode workaround, but this is first-gen, and it doesn't have to stay that way. Future iDevices might load a different OS onto the SoC and output lossless 1080p using the exact same adapter! And without breaking older devices.
It frees Apple from having to define a fixed method of transmitting HDMI over Lightning now, that is then set in stone for the next 10 years, and has to be supported by every future device.
It also frees them from having unnecessary pins, which might become useless in the future but have to carry over to every new device (a.k.a. the 30-pin connector). And knowing Apple, probably THE top priority of Lightning was to have a slick, easy-to-plug-in-and-out, user-friendly connector, which Lightning admittedly does way better than any Micro USB standard.
Because in essence, the only thing that is fixed about Lightning is the physical shape and pins, so they focused on getting that aspect right and future-proof. How the data is transmitted can be changed on a device level basis.
The problem isn't even serial-to-parallel - HDMI is serial based - the problem is that Apple apparently designed Lightning with insufficient bandwidth for uncompressed video, then kludged around it. Then Apple fanboys went on and on about how much more elegant it is than MHL, which has much cheaper HDMI adapters and better video quality because all the MHL adapters have to do is convert one uncompressed serial video data format to another.
I mean, technically speaking Samsung or any of the other manufacturers could've done the same trick as Apple using plain old micro-USB OTG 2.0 with no special hardware support in their phones, no special connectors... but the reviewer community would call them out on it because it's ugly and user hostile, if their engineers even let them get that far.
I strongly disagree that reversibility is a small feature. Plugging the charger into new iOS devices is effortless, in the same way headphone jacks are.
Non-symmetrical connectors are an affront to usability.
I agree. I don't even have a device with the connector (yet?), but it seems like a major advantage.
Who are all these people popping out of the woodwork wanting a wired connection from their phone to their TV? I'm sure some people do this sometimes, but so many? Why would you even do that? Perhaps this is uncharitable, but it makes me think that most of the people complaining here have never done it, never will, probably never even thought about doing it before, but are now outraged at the thought that the connector is not 100% perfect for this one uncommon use-case.
The only two use cases I can think of are playing a video you recorded on your phone at a family gathering, or playing Netflix on your TV without needing to hook up a Roku or similar.
Edit: third use case, hooking this up to a monitor to turn your smartphone into a desktop computer.
Thanks, that's the first use case that I could actually see using myself. Don't think it's quite enough to get me to go buy a cable, but I can see it being handy for that.
It's especially strange because if you want to do that, you can Airplay the video to the TV, and not have to deal with a cable from your phone to your TV.
Asymmetric connectors weren't bad, it's when you have asymmetric connectors in rectangular plugs that it becomes a problem. I never tried to put FireWire 400 in backward, but USB is awful.
"And it introduces that complexity at a point of physical vulnerability -- people don't treat adapter like tiny fragile computers. They treat them like, well, adapters."
I don't understand this. What makes these things any more fragile than a regular adapter? They are, as far as I understand it, compact, fully solid-state, and about as strong as any consumer electronics of that size would be.
> defined as Apple, you know the company with the most popular open source operating system in the world?
I'm not sure which OS you're talking about...perhaps you could point me to the source code of either OS X or iOS? Certain core components of OS X are open source, but Darwin isn't OS X.
As someone who makes his living from writing Objective-C code, I don't have any ideological objection to Apple. But I think you shouldn't accuse people of spreading "lies and engineering ignorance" when you seem to be claiming something that's patently untrue.
Perhaps you didn't experience the PR storm around the time OS X was initially released: it was heavily geared towards nerds and the literal phrase "open source" was extensively employed in their marketing material.
This aside, you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality, one of the common features of the discussions here that tends to make my skin crawl.
you're basically trying to write off nirvana's (IMHO excellent) rant using a minor technicality
He's not trying to write anything off and it is a pretty big technicality. nirvana should have omitted that ideological jab to begin with as it was unnecessary and shows his bias for Apple/against Google. Google is more "open source" than Apple in any way that matters, considering Android is currently on devices their competitors (Amazon) are selling. Apple is mostly responsible for WebKit which is commendable and useful but as far as practical considerations go, nobody gives two shits about Darwin.
But yes, engineering should be the focus and people assume the worst with Apple.
>But yes, engineering should be the focus and people assume the worst with Apple.
Some people sometimes assume the worst of Apple, or Microsoft or Google, or [insert name of company here]. One of the things that can cause strong anti-Apple sentiment are the rabid fanboys (ie. postings like Nirvana's). They paint Apple to be patron saints - and when reality hits (like it did for me with antenna-gate), users are annoyed because of the unrealistic expectations, but also because of the RDF created by Fanboys.
For the record, I think Apple have shown the phone industry a thing or two about engineering excellent products while maintaining a strong focus and excellent compromises. I just wish the rabid Fanboys would shut up, or present a balanced view ... it would make Apple a lot easier to respect.
I would tend to agree wrt HN nitpicking except in this case nirvana is presenting himself as the paragon of objectivity setting the story straight against the unwashed rabble of knee-jerk Apple haters, when in fact he is nowhere near objective when it comes to Apple, and shooting ignorant fish in a barrel is not good enough to validate his points. He needs to be held to a higher standard.
The technicality being that Darwin ceases to be an open source operating system once it is shipped with closed source components. Why would this be true?
OK here we go, and why not, after all it's Sunday and I've got nothing better to do.. right?
The central tenet was that discourse here is regularly devoid of sound engineering because it tends to be blinded by mindless cultural perceptions of the companies involved in whatever happens to be under discussion. In the case of Apple the expectation is that their products are flawless, and if not, there will be hell to pay on blogs and comment sections everywhere.
Whether or not Darwin is or isn't open source doesn't freaking matter, it was heavily marketed as such back in the sands of time and even if this wasn't the case it doesn't invalidate the central point made in the rant - that just because this device has an Apple logo every popular discussion surrounding it turns to mindless diatribe as a result of non-engineering centric expectations people place on their products, and every engineering-centric party (i.e. hackers) must deal with the whining polluting engineering-centric forums for days every time it happens.
In effect, the complaint is that commenting resembles the squabble of a throng of uninformed consumers rather than the judicious discourse of a forum of engineers.
I remember back in the early days – from Mach on black hardware through Openstep on 4 different architectures – the folks from NeXT were always very careful to use the phrase "system software" when referring to the whole thing and only using the phrase "operating system" when referring to the layer that supports basic functions like scheduling tasks, invoking user-space code, and controlling peripherals.
This is one of the things I appreciated about them back then, as they were respectful of the nomenclature actually used in computer science.
Now I realize that the phrase "operating system" commonly receives slight colloquial abuse to refer to everything inside the shrinkwrap, but I think the formal meaning hasn't completely died yet, so nirvana should be allowed to use it properly if he so desires.
Darwin is the operating system. Trying to point out that the whole system stack is considered not open source because the windowing system isn't open has nothing to do with it. Does my Ubuntu system become not open source when I use the binary nvidia drivers?
So what does Lightning Digital AV Adapter do today that could not have been done using Micro USB->MHL? Looking at the reviews the Apple solution is a) pricey b) over engineered and c) performs worse than MHL. And you are talking about Apple haters creating a reality distortion field?
One way you could defend it is to promise features that can be programmed into the adapter firmware but if today it does 720p poorly I see no reason to believe something much more useful/better will come later. I am paying the $50 today, not in the future.
> The Samsung Galaxy S III and Galaxy Note II use a connector that is similar to the original 5-pin MHL-HDMI adapter/dongle, but it uses 11-pins in order to achieve a few functional improvements over the 5-pin design
Both Micro USB and MHL seem to be industry standards - ref Wikipedia. Lots of vendors seem to be putting out cheaper Micro USB to MHL adapters - so even if (Micro USB->MHL) may not be a standard, it is at least built on two standards, and proprietary licensing/approval seems to be unnecessary for vendors.
Also most MHL adapters seem to do 1080p - the Lightning one seems to do 720p badly if we are to believe Apple store reviewers.
The sad thing is that Lightning is even inferior to the Apple 30-pin connector for this use. You get better video out on an iPhone 4S than a brand new iPhone 5, and the connector is $10 less and has a pass-through so you can charge at the same time.
This seems like the kind of solution you would bodge together if someone gave you the already designed Lightning connector and said "now make it support VGA and HDMI". A completely crazy hack. Bashing Apple over it seems completely reasonable.
I have always felt this way, I just don't feel a need to make a noise about it.
Be wary of confirmation bias and sample set bias (you only hear the worthless noise from those who are speaking it) when reading sites like HN/Reddit/etc.
It's a lot easier to hit the upvote button than it is to type a comment. Not all of HN's constituents are whiny blowhards.
I agree 100%. Software is eating the world, so I'm not sure why everyone is so against this.
Maybe it's a case of I didn't complain when they came for my DEC Alpha server with green screen, nor when they took away my Token Ring network, but I will not stand for only using one flimsy cable for all my devices. Come on, this is the tech industry, what did you think was going to happen?
What I see Apple's done here is future-proof the connector. OK, so it doesn't output 1080p today, but I see no reason why it couldn't tomorrow. Devise a new protocol, download an update to all the iDevices, which in turn upgrades all the adapters out there, and everything's golden. Once this (admittedly painful) transition is complete, I see no reason for Apple to have to endure another one. By the time it's outdated, I'm sure everything will be wireless.
Perhaps everyone complaining about a $30 adapter shouldn't have purchased a $600 phone and instead stuck with a $20 Moto Razr.
I do see where you're coming from, and agree it sucks that they've gone backward in quality in this case. To my mind those are implementation details that Apple screwed up. It doesn't invalidate the basic idea, though, which is to move the brains of the device into software so that the same connector can be used for a multitude of different functions, some of which don't even exist today.
Admittedly, I'm not privy to whatever design decisions the team that implemented the connector made, but I see no reason why it couldn't have the same fidelity that a straight hdmi cable would have. If I guessed, I'd say that they said 'good enough, ship it' instead of continuing to refine it since they knew they could always send down an update later.
The point here is that the electrical design of Lightning doesn't have the bandwidth for 1080p. It runs at USB 2 speeds.
Software can't magically make hardware do things. It can create an illusion (like the Lightning adapter does), but the design has to support, for real, capabilities you want to properly provide.
I agree with what you're saying, but I'm interested about Apple's open source operating system. Assuming OS X, I'm having trouble finding its entirety in open source, obviously it's built on open source but I don't think the entire OS is open sourced. I'm also having trouble believing it wouldn't have been used for a Linux desktop by now.
Sorry for replying to a small part of your comment, but I really do agree with the rest.
Can you explain a little more what you're getting at? I'm not very familiar with hardware and don't know anything about video. The adapter seems like a neat hack, but definitely a workaround for something. I can't really tell if you're defending it or not.
What's the win from an engineering standpoint here? And why is this an inevitable design (which you suggest if I understand you right)? What are some other options and what are the reasons those might not have been used?
Are you seriously defending Apple by saying a serial bus is the obvious solution? Apple, the company that has forced ridiculous proprietary ports that add zero value into its products while everyone else in the space was using Universal Serial Bus? If this post was meant to be sarcastic then I will accept a well deserved whoosh, but ideology trumped engineering at Apple years ago.
I think you are wrong about that. Apple shipped the iMac with USB 1.1 starting in 1998 [1]. It was the first Mac with a USB port. USB was developed in 1994 by a group of seven companies: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel. [2] Windows 95 gained built-in support for USB devices with the OSR 2.1 release in 1997. [3] The market share of the Mac at that time was about 4.6% [4]. So no, the iMac wasn't the first computer to ship with USB support, nor was Apple the reason USB went mainstream.
“Few USB devices made it to market until USB 1.1, released in August 1998, which fixed problems identified in 1.0, mostly relating to hubs. 1.1 was the earliest revision to be widely adopted.”
The iMac G3 was released in August 1998. I didn’t say the iMac was the first computer to have USB ports, because it probably wasn’t quite the first (although, interestingly, no other computer comes up when you try to Google this); importantly, though, it only had USB ports, and killed off ADB, serial, parallel, and SCSI, forcing users to start buying USB peripherals. My family had to get a serial to USB adapter that still worked with OS X the last time I tried it with a GPS receiver (I just looked it up and it may have finally stopped working with Mountain Lion, nearly 15 years later). It was, what, about ten years after that that most PCs finally stopped including PS/2, serial, and parallel ports?
Someone complains that Apple promoted FireWire and was late to support USB 2.0, but FireWire came out first and was technically superior (although some devices do have weird compatibility issues), and that Apple dragged its feet supporting USB 3.0 because they were trying to promote Thunderbolt, but I believe this was because Apple is using Intel chipsets (because Intel killed off Nvidia’s chipsets) and Intel was doing exactly what this person accused Apple of doing.
USB was around long before Apple stopped providing PS/2.
If you buy a phone or tablet from anyone that isn't Apple you will very likely get a USB port. If you buy Apple you will not. Trying to argue that Apple is somehow looking to the future by providing a serial bus years after it was the norm is hard to comprehend.