Having been involved in the IT industry for over two decades, I have often pondered which great technologies are now obsolete despite being superior to their competitors. Their demise was usually brought about by the mass adoption of an inferior alternative that was, more often than not, less expensive.
So this article is perhaps my way of saying a last goodbye to these technologies:
When Panasonic announced in October 2013 that it was exiting the Plasma market, it was the end of an era. Some manufacturers may continue to produce Plasma displays over the next few years, but Panasonic was the leader and champion of Plasma display technology. Its focus is now on LED and OLED displays, and OLED technology will eventually surpass Plasma.
Plasma displays are superior to LCD/LED displays. LCD displays have been playing a game of ‘catch-up’ for quite some time. The specifications of an LCD display would usually describe how far the technology’s inherent issues had been corrected rather than how good the product was. I am sure we all remember early flat-panel TVs ghosting as players ran up and down the pitch during a football broadcast.
For reasons unknown to me, the majority of my customers had the perception that LCD/LED TVs were better than Plasma. I would give them at least four reasons why Plasma displays are better:
- It renders pure black. No light comes off true black areas on a Plasma. An LCD/LED display always has a light source turned on behind the LCD panel, making it impossible to generate pure black. This might not sound like a big deal, but there are several shades of grey leading up to black; on an LCD display, black is merely a dark grey, which affects the overall picture quality.
- The contrast ratio on Plasma displays is measured in the millions; on LCDs it is measured in the tens of thousands.
- The response time on a Plasma is measured in hundredths of a millisecond; on LCDs it is measured in milliseconds.
- The operational lifespan of an LCD/LED display is often a quarter that of a Plasma display.
Plasma had a reputation for screen burn-in, but in reality this is a problem for all displays, regardless of their technology (except OLED).
Plasma also had a reputation for being less energy efficient. This is possibly true, but only very recently. Plasma’s energy consumption varies greatly depending on the content being viewed, so a Plasma could potentially consume less energy than a similarly sized LCD/LED display. Plasma displays use more energy when displaying bright imagery and less when displaying darker imagery, whereas LCD/LED displays consume energy at a roughly constant rate regardless of the content being displayed.
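The difference in behaviour can be sketched with a toy power model. The wattage figures below are illustrative assumptions, not measured specifications of any real set; the point is only the shape of the relationship between content brightness and power draw:

```python
# Toy model: plasma power draw scales with average picture level (APL),
# while an LCD's backlight draws roughly constant power. All wattage
# figures here are assumptions chosen purely for illustration.

def plasma_power(apl, base_w=50, max_w=400):
    """Plasma power rises with APL (0.0 = all black, 1.0 = all white):
    dark scenes light few sub-pixels, bright scenes light many."""
    return base_w + (max_w - base_w) * apl

def lcd_power(apl, backlight_w=150):
    """The LCD backlight stays on at the same level regardless of content."""
    return backlight_w

# A dark film scene (low APL) versus a bright sports broadcast (high APL):
for label, apl in [("dark scene", 0.15), ("bright scene", 0.85)]:
    print(f"{label}: plasma {plasma_power(apl):.1f} W, LCD {lcd_power(apl):.1f} W")
```

On this model the plasma undercuts the LCD on dark content and exceeds it on bright content, which is why average-consumption comparisons between the two technologies depended so heavily on what was being watched.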
The rise of LED/LCD displays has driven the development of thinner and lighter panels, which is one significant factor in the demise of the Plasma display.
You can still purchase a Plasma monitor or TV, but when they are gone, they are gone forever. I suspect that Panasonic will stock its flagship professional monitors for select customers such as broadcasters, but having mothballed its last Plasma factory in March 2014, this really is the end of an era.
Firewire is an ultra-fast data port used mainly for connecting storage devices to computers. It was streets ahead of USB in terms of transfer speeds, and on some ports it was able to deliver more power than USB.
USB has pretty much wiped out the need for Firewire, despite the fact that USB spent years playing catch-up with the transfer speeds possible over Firewire. Recently USB 3.0 has started appearing on PCs and laptops, pretty much ending the transfer-speed argument against USB. USB, despite its inferiority, has become ubiquitous: it features on all manner of devices, including mobile phones, TV sets, audio equipment and printers.
There is still hardware being manufactured with Firewire ports, but this is purely to support those who have invested heavily in Firewire technology over the past two decades.
I remember, not too fondly, when USB first arrived and Windows 95 promised us ‘Plug and Play’. Oh, the joys of watching Windows crash when you plugged in a printer, or unplugged it for that matter. I remember cursing HP when it started releasing printers with a USB port and no Centronics parallel port. As USB began to feature on more and more devices and the software supporting it became more stable, my hang-ups over the technology began to fade. By the time USB 2.0 arrived, it had matured into a stable and widely supported standard.
When Apple adopted Intel x86 CPUs, the fate of Firewire was pretty much sealed. Apple still features Firewire on some of its products, but it is likely to be phased out completely as the iMac and MacBook ranges are updated.
SCSI was primarily used as a means of connecting hard disks to computer systems, but it also supported other devices, such as scanners. SCSI featured in file servers and came as standard in early Macintosh computers.
For a long time SCSI offered much higher transfer speeds than other data buses, such as the PC-standard IDE. It had the further advantage of supporting more devices on a single bus: when SCSI started, you could connect up to seven devices, usually hard disks, on one cable, whereas IDE would only support two. The support for multiple drives on a single controller made SCSI the natural choice for server disk RAIDs, and later for Storage Area Networks (SANs).
With the launch of SATA and the rise in popularity of USB, the advantages SCSI offered were slowly eclipsed by the lower costs associated with SATA. SCSI was (and still is) quite expensive and SATA isn’t, so hardware manufacturers started building NAS boxes and other storage products on SATA, bringing about the demise of SCSI, at least for the layman. Nowadays the average PC motherboard will support at least four SATA drives and will have RAID capabilities.
A variation of SCSI is still in use in very large data storage installations: Fibre Channel, which is based on SCSI principles. That technology really belongs to the realm of the large enterprise, and even there I would say its days are numbered, as it is incredibly expensive and the SATA-based alternatives are not.
Token Ring is a network technology championed by IBM and was used almost exclusively in organisations that built their IT divisions on IBM kit. It was fast, but more importantly, it could handle very high volumes of traffic without becoming congested or suffering high latency.
Ethernet, which practically everyone uses nowadays, is not as adept at handling or managing large volumes of network traffic. Without going into the technical differences between Token Ring and Ethernet, the best analogy I can give for how the two differ is to imagine driving your car in two very different countries. Token Ring is a Western European country: while there is congestion on the roads, traffic does move, because there are road signs and traffic signals to ease congestion and prevent accidents, most people on the road understand the rules, and there are strong disincentives for driving badly. Ethernet, in comparison, is somewhere like India, where there are few road rules and everyone drives pretty much the way they want, taking a fatalistic approach: if it is your time to die, it is predestined, so you don’t really need to pay due care and attention to the traffic around you; the result is lots of collisions.
Ethernet has become ubiquitous purely because it is substantially less expensive than Token Ring, and in certain instances less complicated. Over time network hardware has become cheaper and cheaper, and now a significant proportion of us have home networks built on Ethernet technology. Development on Token Ring ceased a long time ago, and so, despite the chaos in how Ethernet manages network traffic, it has become ever faster, and network hardware has become better at managing or preventing collisions.
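The collision handling the driving analogy pokes fun at is classic Ethernet’s CSMA/CD scheme with binary exponential backoff: after each collision, a station waits a random, exponentially growing number of slot times before retransmitting, which thins out competing traffic. A minimal sketch of just the backoff rule (slot-time handling simplified; the 51.2 µs slot is the figure for 10 Mbit/s Ethernet):

```python
import random

# Binary exponential backoff as used by classic (shared-medium) Ethernet:
# after the n-th consecutive collision, a station waits a random number of
# slot times drawn from [0, 2^min(n,10) - 1]. After 16 failed attempts the
# frame is discarded. Token Ring avoids collisions entirely by letting a
# station transmit only while it holds the token; Ethernet recovers instead.

SLOT_TIME_US = 51.2  # slot time for 10 Mbit/s Ethernet, in microseconds

def backoff_delay(attempt, rng=random):
    """Return the wait in microseconds after the given collision count."""
    if attempt > 16:
        raise RuntimeError("excessive collisions: frame dropped")
    k = min(attempt, 10)            # contention window stops growing at 2^10
    slots = rng.randrange(2 ** k)   # pick 0 .. 2^k - 1 slot times
    return slots * SLOT_TIME_US

# The maximum possible wait doubles with each collision:
for attempt in (1, 2, 3, 4):
    worst = (2 ** attempt - 1) * SLOT_TIME_US
    print(f"after collision {attempt}: wait up to {worst:.1f} us")
```

Under light load this recovery scheme works well, which is part of why cheap, simple Ethernet won; it is only under sustained heavy load that the collide-and-retry approach loses badly to Token Ring’s orderly token passing, and modern switched Ethernet sidesteps the problem by eliminating the shared medium altogether.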