Technology is advancing so fast these days that it's worth pausing to remember how far we've come. From the smartphone in your pocket to the TV in your living room, let's take a look at some key technologies, then and now.
Data Storage
We couldn’t always rely on the digital realm to store everything from photos to messages. Data storage originated in the textile industry during the 18th century, when "punch cards" were used to record series of instructions for mechanical equipment like Jacquard looms.
Those paper cards were later used to input data by hand-punching holes and feeding them into a card reader, which converted specific hole sequences into digital information. Holes punched in each column represented characters in a standard binary code, and entire programs could be stored this way, as long as the stack of punch cards stayed in order.

By the 1960s punch-card methods were far less common; back in 1956, IBM had debuted the world's first hard disk drive, the IBM Model 350. It weighed over a ton, contained 50 24" storage disks worth a grand total of 3.75MB, and was leased to companies for $3,200 per month, around $30,260 post-inflation.
A decade later that hulking machine was superseded by a new age of data storage: the floppy disk. Those thin, flexible disks arrived in 1971 in 8" form (later shrinking to 5 ¼" and then 3 ½") and were written and read using a floppy disk drive. The original 8-inch floppy had a storage capacity of about 80 kilobytes, meaning you'd need about 50 of them to store a single 4-minute song averaging 4 megabytes, but by 1986 IBM had introduced a 3 ½" version capable of holding 1.44 megabytes.
1999 saw the arrival of the impressive, super-compact SD memory card from SanDisk, Matsushita and Toshiba, and in 2000 people were first introduced to the familiar USB "thumb" drive, initially capable of storing 8MB. Nowadays USB drives can hold hundreds of gigabytes of data, and there's even a microSD card smaller than your fingertip and about the thickness of a needle that can hold a whole terabyte. That's roughly 40 hours of 4K video!
For comparison, you'd need some 266,666 IBM 350s to hold that! Most people now rely on "the cloud," popularized around 2007, which is estimated to hold well over an exabyte of data: the equivalent of more than 1 billion gigabytes. And with Moore's Law suggesting that computing capabilities roughly double every two years, who knows what's to come?
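Those storage comparisons are simple enough to sanity-check yourself. Here's a quick Python sketch using the approximate sizes quoted above (decimal units throughout, so treat the results as rough):

```python
# Approximate capacities quoted in the article
FLOPPY_8IN_KB = 80        # original 8" floppy disk, ~80KB
SONG_MB = 4               # average 4-minute song, ~4MB
IBM_350_MB = 3.75         # IBM Model 350 hard drive, 3.75MB
TERABYTE_MB = 1_000_000   # 1TB expressed in megabytes

disks_per_song = SONG_MB * 1000 / FLOPPY_8IN_KB
ibm_350s_per_tb = int(TERABYTE_MB / IBM_350_MB)

print(f"{disks_per_song:.0f} floppies to hold one song")      # 50
print(f"{ibm_350s_per_tb:,} IBM 350s to hold one terabyte")   # 266,666
```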
Mobile Phones
In the modern world smartphones are treated as a necessity, but the history of the mobile phone is a surprisingly short one. In April 1973, Motorola researcher Martin Cooper made the first cellular telephone call, to announce that Motorola had won the mobile technology race. His device had a single-line, text-only display, weighed 2.5lb, was 23cm long, and offered only 30 minutes of talk time while taking some 10 hours to recharge.
On September 21st, 1983, the first commercial mobile phone, the Motorola DynaTAC 8000X, became available, but it was still a long way from reaching the masses. The device cost an eye-watering $4,000 (over $10,300 post-inflation) and had a measly 20-minute call time, despite the battery weighing 4 to 5 times more than the rest of the phone.

By the early 1990s the mobile phone was no longer reserved for business use or as a major status symbol, and the first handset to take advantage of the transition to digital consumer networks was the Nokia 1011, which hit stores in 1992. That world-first mass-produced mobile phone could hold 99 phone numbers and was the forefather of the iconic Nokia 3310, which had SMS capabilities and helped the company dominate the market after its debut in 2000.
It wasn't until 2007 that Steve Jobs and his collaborators transformed the industry again with the first-generation iPhone, which boasted a 3.5" screen, 8-hour talk time and swanky new application features. Nowadays there are over 150 mobile phone manufacturers in the world, but Apple is still considered a key innovator, with recent iPhones boasting a 6.1" 'Liquid Retina' display, a lightweight 0.4lb frame and upwards of 256GB of storage.
Reading
Reading is a historically valued practice: it can improve our worldview, mental processing and education, but much has changed over the short history of modern digitization. A few decades ago, picking up a book and reading it cover to cover was the primary way to consume narratives, but as technology has invaded modern households, reading has been transformed by laptops, smartphones, tablets and especially e-readers.
In 1997, leading innovator E Ink developed a revolutionary electronic paper display technology that allowed a digital screen to reflect light like ordinary paper, without a backlight. The technology wasn't widely adopted until Amazon founder and CEO Jeff Bezos commissioned work on a dedicated e-reader in 2004, and the race to revolutionize reading was on.

As it turns out, Sony was the first to use E Ink technology, in its Librié and Sony Reader devices of 2004 and 2006, but the concept didn't fully take off until Amazon announced the first-generation Kindle in 2007. That compact reading device incorporated E Ink into its 6-inch display, featured 250MB of internal storage (enough for some 200 titles) and sold out within five and a half hours.
By 2015, eMarketer estimated that there were around 83.4 million e-reader users in the US, and in 2018 e-books were expected to make up over 50% of consumer publishing revenue in the US and UK. Amazon's newest water-resistant Kindle Paperwhite now starts at 8GB of storage, meaning it can hold up to 2,000 titles. Considering that some 1,400 printed books weighing about 1lb each would weigh the same as half an average car, that's a staggering amount of literary potential to have at your fingertips.
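To see where the 2,000-title figure comes from, here's a back-of-the-envelope check in Python; the per-book size and the car weight are assumptions for illustration, not official figures:

```python
# 8GB split across 2,000 titles implies ~4MB per e-book
kindle_storage_gb = 8
titles = 2000
mb_per_title = kindle_storage_gb * 1000 / titles
print(mb_per_title)   # 4.0 -> roughly 4MB per title

# The "half a car" comparison, assuming a ~2,800lb average car
books, lb_per_book, car_lb = 1400, 1, 2800
print(books * lb_per_book / car_lb)   # 0.5, i.e. half a car
```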
Computers
Without basic computing, modern tech like smartphones wouldn't even exist, but it's been a long road to reach this point. In 1890, punch-card-based machines were relied upon to compute the US census results, after population growth meant the previous count had taken several years by hand, but the true ancestor of modern computing was Charles Babbage's Analytical Engine prototype.
Although never fully operational, that steam-driven engine marked a transition from mechanical calculation to general computing, and a century later, in 1941, the first general-purpose 'Z3' computer was completed by German engineer Konrad Zuse. Intended to calculate aerodynamics in aircraft design, the machine weighed a ton, still relied on punched tape for data storage, and was only capable of basic mathematical functions at a top clock speed of just 5-10 Hertz.
The first mass-market desktop computer, the Programma 101, wasn't unveiled to the public until 1964. Priced at $3,200 (around $26,550 inflation-adjusted), that popular 65lb device made computing compact and accessible, but it could only support basic mathematical functions with a minuscule 240-byte memory; that's enough to store only about 0.006% of a single iPhone photo.

Apple's landmark Apple 1 computer, which resembled a bare circuit board and required buyers to source parts like a keyboard and monitor themselves, arrived in 1976 for $666.66, or about $3,013 post-inflation. The system supported 4KB of memory and a 1MHz 6502 Central Processing Unit, some 4,400 times less processing power than a recent 4.4GHz Apple Mac Pro. That desktop also boasts a whopping 1.5 terabytes of storage; you'd need 18,750,000 8" floppy disks to hold that!

The Mac Pro is far from the most impressive modern computer on earth, though. IBM's 'Summit' (OLCF-4) is currently the most powerful supercomputer in the world.
It has 250 petabytes of storage (that's 250,000 terabytes, or 250,000,000 gigabytes) and is so fast that its speed is measured in floating-point operations per second, or FLOPS. Summit is capable of a peak 200 petaFLOPS; to give you an idea of how much processing power that is, it recently became the first supercomputer to reach 'exaop' speeds during a genomic analysis, meaning it processed over a quintillion operations per second!
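The ratios in the last few paragraphs check out with some quick arithmetic; here's a Python sketch using decimal SI prefixes (note the 'exaop' comparison is against Summit's quoted peak FLOPS figure, and real workloads vary):

```python
PETA, EXA = 10**15, 10**18

# Apple 1 (1MHz) vs a recent 4.4GHz Mac Pro core
print(4.4e9 / 1e6)            # 4,400x the clock speed

# 1.5TB of Mac Pro storage expressed in 80KB 8" floppy disks
print(1.5e12 / 80e3)          # 18,750,000 disks

# Summit's storage and speed
print(250 * PETA // 10**12)   # 250,000 terabytes
print(EXA / (200 * PETA))     # an exaop is 5x Summit's peak 200 petaFLOPS
```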
Laptops
As desktop computers made their way into offices, industry pioneers focused on compacting computing capabilities into something more convenient. The first truly portable computer, the Osborne 1, was created by designer Adam Osborne in 1981 and was about the size of a portable sewing machine, weighing some 24lbs.
That revolutionary 'luggable' device had a 5" screen, two 5 ¼" floppy drives, a modest 4MHz CPU and a price of $1,795, about $5,080 inflation-adjusted. In 1986, IBM then announced the first real laptop computer, the $2,000 ($4,700 post-inflation) 'PC Convertible', which had 256KB of memory and a slightly faster 4.77MHz CPU, but which could also, crucially, run on battery power.
Apple then entered the laptop market with the Macintosh Portable in 1989, which had a 40MB hard drive and a 16MHz CPU, but it was considered one of the company's worst products to date. Not only did it weigh 16lbs (3lbs heavier than the outdated 13lb IBM PC Convertible, and the same as about five modern MacBook Airs), it came with a hefty price tag of $7,300, nearly $13,500 post-inflation, and could barely store a few modern photos at an average of 6MB each.
Thankfully, the revised Macintosh PowerBook 100 of 1991 fared much better with its 16MHz CPU, expandable 40MB hard drive and more lightweight 5.1lb frame, but it still pales in comparison with modern tech. The 2019 16" MacBook Pro, priced from $2,399, offers up to 8 terabytes of storage, 200,000 times more than Apple's first laptop, and runs on an 8-core 2.4GHz processor, each core alone being some 150 times faster than the PowerBook 100's CPU.
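Assuming the spec figures above, the generational gap works out like this (a rough Python sketch comparing raw numbers only, not real-world performance):

```python
# Storage: 1991 PowerBook 100 (40MB) vs 2019 16" MacBook Pro (up to 8TB)
powerbook_hd_mb = 40
macbook_tb = 8
print(macbook_tb * 1_000_000 / powerbook_hd_mb)   # 200,000x the storage

# Clock speed: 16MHz vs 2.4GHz per core, across 8 cores
powerbook_mhz, macbook_ghz, cores = 16, 2.4, 8
per_core = macbook_ghz * 1000 / powerbook_mhz
print(per_core)           # 150x faster per core by clock alone
print(per_core * cores)   # 1,200x in raw cycles across all cores
```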
Calculators
If you needed to solve something today you probably wouldn't think twice about pulling out your smartphone, but once upon a time it took great amounts of machinery to perform even basic calculations. Back in 1820, a mechanical machine now hailed as the first incarnation of the modern calculator, the 'Arithmometer', was patented by Thomas de Colmar and later manufactured between 1851 and 1915.
To use that desktop-sized device, you'd select the desired mathematical function on the reversing lever, set your numbers using cursors on 0-9 scales, and then turn the crank, which would display the result in the 'totalizer' windows at the top. During WW2, more complex calculating power was required for military precision, so a bigger, better machine was built to take on the job.

The 'Electronic Numerical Integrator and Computer', or ENIAC, was the world's first general-purpose calculating computer, built in 1946 at a cost of $487,000 (over $6 million post-inflation) at the University of Pennsylvania. The machine could solve a large class of numerical problems and was about 1,000 times faster than electro-mechanical computers, but there were a few drawbacks.
ENIAC required over 17,000 vacuum tubes, 70,000 resistors and 10,000 capacitors, weighed about 30 tons, filled an entire 30-by-50-foot room and consumed an insane 150-200 kilowatts of power. It wasn't until 1967 that the first portable calculator, known as the 'Cal Tech', was produced by Texas Instruments; it printed its results and was capable of basic arithmetic using just four compact integrated circuits.
By 1980, pocket calculators had reached the form we still recognize today: battery-powered, built around a single chip and sporting an LCD display. Nowadays even the most advanced scientific calculators can be replicated in software, and smartphones come with built-in calculating capabilities.
TVs
Television as a concept had been discussed as early as 1900, but it wasn’t until 1924 that the first real steps were made. Scottish engineer John Logie Baird developed a way of passing a beam of light through a rapidly spinning disc punched with holes, so that a simple image could be scanned, transmitted and reconverted.
The first mechanical TV sets of the 1920s, such as the Baird Models B and C, resembled large cabinets with a tiny display and primitive scanning technology inside. The earliest adaptations of Baird's prototype were mechanical rather than the digital technology we're used to now and relied on scanning vertical lines, achieving just 30 lines per frame. For some perspective, turn the video below down to 144p, which uses 144 lines per frame; the earliest screens were about five times worse than that, which explains the terrible picture quality.
Televisor 3 - Mechanical Television by TelevisionDk

From the 1930s, Marconi-EMI's far-superior electronic television sets appeared. Fitted with cathode ray tubes, they could produce 405 scanned lines (13.5 times more than Baird's mechanical TVs), and picture quality could now be measured in pixels. Sets like the 1948 Admiral Model 19A12, costing $165.95 (over $1,800 post-inflation), had around 503x377 pixels in each frame and could produce 25 frames per second.
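The line and pixel figures in this section are easy to verify; here's a quick Python sketch using the resolutions quoted (back-of-the-envelope arithmetic, not official broadcast specs):

```python
# Lines per frame: Marconi-EMI electronic sets vs Baird's mechanical system
print(405 / 30)   # 13.5x more scanned lines

# Pixels per frame for the displays discussed in this section
resolutions = {
    "early electronic TV": (503, 377),
    "1080p Full HD": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}
base = 503 * 377   # ~189,631 pixels
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} px ({pixels / base:.1f}x the earliest set)")
```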
After the advent of color TV in the 1960s, the next big development was the replacement of cathode-ray technology with LCD displays, which made TVs thinner by using liquid crystals instead of projecting images by splitting signals into red, green and blue light. Pixel density per frame, measured in megapixels (1 megapixel being 1 million pixels), is the best way to see how far TV quality has come.

With its 503x377 display, the earliest electronic TV had about 189,631 pixels per frame, more than five times fewer than a single megapixel. For some perspective, a standard 1080p HD TV has a 1920x1080 display containing 2,073,600 pixels per frame, or 2.07 megapixels; that's almost 11 times the pixel count of the 1930s model. More modern 4K OLED UHDTVs start at 3840x2160, about 8.3MP per frame, which is almost 44 times more than their ancestor, while the super-impressive 8K UHDTV's 7680x4320 display has about 33.2MP per frame, nearly 175 times the pixel count of the original electronic sets!

Who knows how we might watch our favorite shows in the future, but the LG rollable TV released in 2020 is certainly a good indication. If you were amazed at this analysis of technology then and now, you might want to read about how far science has come. Thanks for reading!