TECHNOLOGY CAN be a minefield at the best of times, especially for the "others". You know the ones. The ones that regularly ring you for tech support because "you know about this stuff".
For years now, we've been dangled the carrot of standardisation. Cables, operating systems, memory cards - all those complex cogs in the machine that make the big stuff work.
And yet it's 2019 and the whole system seems to have broken down. Everything moves up in increments, and we're further away than ever from being able to borrow your mate's charger. The advent of microUSB sounded like it would be the beginning of something beautiful, and it's still the most consistent option we have. But what about some of those other USB formats? USB-B, microUSB-B SuperSpeed, USB-A with an orange lip? What does it all mean?
When the USB-IF group announced last week that USB 3.2 would be rebranded with meaningless names like "SuperSpeed+", it became clear that people in the tech industry have no idea how to relay this stuff to the public.
USB-C is the latest connector standard and it's designed to replace everything - it can carry USB 3.2, Thunderbolt 3, HDMI and DisplayPort, and deliver power, all down a single cable. But it's an absolute mess. Does the machine carry full four-lane Thunderbolt 3 or a cut-down two-lane version? Can your adapter to a standard HDMI socket handle plug and play? Is the data transfer USB 3.2 Gen 1 or USB 3.2 Gen 2?
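For the morbidly curious, here's a back-of-the-envelope sketch of how those Gen labels map to raw speed - our own table and naming, not anything the USB-IF publishes:

```python
# Illustrative only: a rough decoder for the USB 3.x naming soup, using the
# headline signalling rates each spec defines. Names here are our own.
USB_GENERATIONS = {
    "USB 3.2 Gen 1":   5,   # Gbit/s - the artist formerly known as USB 3.0
    "USB 3.2 Gen 2":   10,  # Gbit/s - previously badged as USB 3.1 Gen 2
    "USB 3.2 Gen 2x2": 20,  # Gbit/s - two 10Gbit/s lanes, USB-C cables only
}

def describe(label: str) -> str:
    """Turn a spec-sheet label into something a human can compare."""
    speed = USB_GENERATIONS.get(label)
    if speed is None:
        return f"{label}: no idea - which is rather the point"
    return f"{label}: up to {speed}Gbit/s"

for name in USB_GENERATIONS:
    print(describe(name))
```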
Not only does none of it make sense, it shouldn't matter - people just want to be able to pick up a cable and go, knowing it won't fry their machine. The recently announced USB4 speeds up connections by adopting the Thunderbolt protocol instead of the traditional USB ones. So isn't that now a Thunderbolt cable? Technically, no - it wouldn't be backwards compatible if that were the case. But the point is…
WHO CARES? You shouldn't have to.
HDMI is similarly fragmented; most HDMI cables aren't even labelled with the version of HDMI they support, and it matters. Use a cable older than HDMI 1.4a and you won't get 3D. Go below HDMI 2.0 and you won't get UHD at a full 60 frames per second. And then there are the connectors - HDMI, mini HDMI and micro HDMI.
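As a rough illustration (the feature list is the commonly quoted one for each revision, and the code is our own sketch rather than anything official), the version lottery looks something like this:

```python
# Illustrative sketch: which headline features arrived with which HDMI revision.
# The mapping reflects commonly quoted capabilities; cable markings rarely tell
# you any of this.
HDMI_FEATURES = {
    "1.3":  ["1080p", "Deep Colour"],
    "1.4a": ["4K at 24/30Hz", "3D", "Ethernet channel"],
    "2.0":  ["4K at 60Hz", "HDR (from 2.0a)"],
    "2.1":  ["4K at 120Hz", "8K", "variable refresh rate"],
}

def supports(version: str, feature: str) -> bool:
    """Crude check: has this feature appeared by the given revision?"""
    seen = []
    for rev, feats in HDMI_FEATURES.items():
        seen.extend(feats)
        if rev == version:
            break
    return any(feature.lower() in f.lower() for f in seen)

print(supports("1.4a", "3D"))           # True
print(supports("1.4a", "4K at 60Hz"))   # False - you need 2.0 for UHD at 60
```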
Bluetooth has always maintained excellent backwards compatibility, but dig down and that Bluetooth 5.0 device may be connecting over Bluetooth Low Energy while falling back to the older Bluetooth 3.0 protocol to actually do its thing. It's a minefield.
But perhaps the worst of the lot is the SD card. We've had full-size SD cards, miniSD cards, microSD cards, and now Huawei is going rogue with its NM (Nano Memory) cards.
Look at a microSD card and you'll see a variety of specifications, all of which dictate whether it's fit for purpose. It used to be a question of whether the card was Class 2, 4, 6 or 10, the number representing the minimum sustained write speed in MB/s. These days almost all cards are Class 10, but rather than create a Class 12, 14 or even 20, a whole new series of standards has emerged. For a start, some Chinese companies still use the old names - TransFlash (TF) or even MultiMediaCard (MMC) - when they mean SD card.
And "compatible" just means the card can be read and formatted - some devices still have a limit on how many files they can read, even if the card can store more.
Then there are the sub-classifications. A microSD card could be SD, SDHC, SDXC (currently the most common), SDUC, SDSC or even SDIO. What does it all mean? Then there's the newly announced SD Express, which promises amazing things but will require compatible devices.
SDXC cards that support the UHS (ultra high speed) standard can be UHS-I, UHS-II or UHS-III. Their ability to record video is measured with another code - V6, V10, V30, V60 or V90. And if you want to run apps from your phone's card, you'll need to know whether it's rated A1 or A2 (or neither).
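To show quite how silly it's become, here's a back-of-the-envelope sketch - our own table and function, listing the minimum sustained write speed each marking guarantees - that checks whether a card can keep up with a given video bitrate:

```python
# Illustrative sketch: what the various SD card markings actually guarantee,
# expressed as minimum sustained write speed in MB/s. The table and function
# are ours; the figures are the published minimums for each class.
MIN_WRITE_MB_S = {
    "Class 2": 2, "Class 4": 4, "Class 6": 6, "Class 10": 10,
    "U1": 10, "U3": 30,
    "V6": 6, "V10": 10, "V30": 30, "V60": 60, "V90": 90,
}

def fast_enough(marking: str, video_mbps: float) -> bool:
    """Can a card with this marking sustain a video stream of video_mbps (megabits/s)?"""
    needed_mb_s = video_mbps / 8   # megabits per second -> megabytes per second
    return MIN_WRITE_MB_S.get(marking, 0) >= needed_mb_s

# A typical 100Mbit/s 4K camera stream needs 12.5MB/s sustained:
print(fast_enough("Class 10", 100))  # False - 10MB/s won't cut it
print(fast_enough("V30", 100))       # True  - 30MB/s is plenty
```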
Finally, the capacity class your card falls into dictates its default file system - FAT12 or FAT16 for plain SD, FAT32 for SDHC, and exFAT for SDXC and SDUC.
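And that matters more than it sounds, because FAT32 caps individual files at just under 4GB - a quick bit of arithmetic (assuming a typical 100Mbit/s 4K stream; the function is ours) shows how little footage that is per file:

```python
# Illustrative arithmetic: how much footage fits in a single file under FAT32's
# roughly 4GB per-file ceiling, versus exFAT which effectively removes it.
FAT32_MAX_FILE_BYTES = 4 * 1024**3 - 1   # just under 4GiB

def minutes_per_file(video_mbps: float) -> float:
    """Minutes of footage at video_mbps (megabits/s) before FAT32 forces a new file."""
    bytes_per_second = video_mbps * 1_000_000 / 8
    return FAT32_MAX_FILE_BYTES / bytes_per_second / 60

print(round(minutes_per_file(100), 1))   # ~5.7 minutes of 100Mbit/s 4K per file
```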
It's exhausting. Consumers are left scratching their heads and wondering what the hell it all means, when the fact is they shouldn't have to.
The bottom line is that before you can even think about cables, you need to know the maximum standard your host device can deal with and, more importantly, the minimum cable or card required to make the best of its features - and, of course, whether a high-capacity card is worth the extra money if your device can't process that much information anyway.
This column isn't designed to be an explainer - it raises far more questions than it answers, and we admit it, but that's sort of the point: it's not our job to explain all this nonsense. What needs to happen is for all these standards to be communicated in a proper, standardised way that's relatable to the public. Perhaps colour codes are the way forward - USB-A connectors carrying USB 3.0 and above have a blue lip, and that really helps, though it's far from definitive.
With the rollout of USB-C being the antithesis of a unified standard (mostly because small Chinese firms, vying to be first to market, ignored the final specification), something needs to be done - not with meaningless brandings, but with semiotics. Otherwise we'll just end up with holiday videos full of dropped frames that can't be viewed in their native 4K, and when you plug in your laptop to check what's happened, a power spike will fry it.
Alternatively, maybe there's an evening class. μ