
Over the last year, I’ve written about various technologies and tools used in today’s industries. Having seen and experienced firsthand the different levels of understanding of the technology that exists today, I thought I’d attempt to present it in my own simplified, layman’s (some might say dated) terms. Having transitioned from an “I.T. guy” to an Operations professional in the early part of this millennium, I can relate quite a bit to both sides.
Here we go. I have taken some liberties with the timing to try and make a point (humorous or otherwise). Healthy debate is welcome, as it will help my understanding of current technology trends; insults, not necessarily so. 🙂
Way back when, in the 20th century, we had computers, networks, and servers (anyone remember Banyan VINES or Novell NetWare?).
Today, it’s all about the internet and the cloud (and/or the edge), terms for remote servers (storage and computing power) located off-site from the company or individual.
We connected to our school or work networks using dial-up modems (300 baud, 2400 baud, or 14.4 kbps if you were lucky), assuming the phone line wasn’t already in use. People would curse when they heard that crackling line noise and lost the connection, especially during a large FTP transfer, knowing they might have just lost their last hour of work.
Now it’s a VPN (a virtual private network, as it reads) accessed over a high-speed (broadband) connection, such as DSL (digital subscriber line, over phone lines), cable, or fibre-optic. Beyond wired connections, there are the latest 4G and 5G wireless networks. I won’t highlight all the differences (that’s another article), but cabling type, technology, frequencies, and transmission protocols all affect speed and latency (lag or delay). With higher speeds (and expectations), people now curse when they get a ‘connection timed out’ error and have to try again *2* seconds later.
In the 1900s, we owned machines from Atari, Commodore, NeXT, Apple, IBM, Compaq, and many makers of “PC clones”.
This century, it’s about Lenovo, Acer, ASUS, and their smartphone relatives: Samsung, LG, and countless others. There are some constants here. Yes, I’m waving at you, Apple, HP and Dell.
We dialled into bulletin boards and online services then (CompuServe, AOL); now we connect to the world (wide web), machines, appliances, friends, family, and companies with barely more than a click or a tap on a screen.
Pre-Y2K, we programmed in Assembler, COBOL, Fortran, BASIC, Pascal, C, and C++.
Today, we “code” in JavaScript and Python, with some PHP as well. And yes, I know C and C++ are still around. Where is my Borland manual when I need it?
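To make the contrast concrete, here’s a toy sketch (entirely illustrative; the task and numbers are made up) of the same little job, averaging a few readings, as it might have looked in line-numbered BASIC versus today’s Python:

```python
# Then: line-numbered BASIC (roughly, from memory):
#   10 DATA 21.5, 22.1, 20.9
#   20 FOR I = 1 TO 3 : READ X : S = S + X : NEXT I
#   30 PRINT "AVERAGE:"; S / 3
#
# Now: the same idea in Python.
readings = [21.5, 22.1, 20.9]
average = sum(readings) / len(readings)
print(f"Average: {average:.2f}")
```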
Pre-2000, computer departments were referred to as either I.T. (information technology) or I.S. (information services and/or systems) and often reported to Finance or Administration VPs.
Departments now fall under a CIO (Chief Information Officer), a CDO (Chief Digital Officer), or even a CTO (Chief Technology Officer). And there are usually subtle differences among the three.
We received education and degrees in Math, Stats, and Computer Science. Computer Science was often a complementary field for those who studied math or one of the natural sciences (Physics, Chemistry, Biology, Geology, etc.).
Now we study fields such as Data Science that encompass Artificial Intelligence, Machine Learning, Deep Learning, Big Data, and Data Mining (plus many more that I can’t even begin to name). This could be a full blog on its own, and one I likely wouldn’t do justice. Maybe I’ll try later in the year, but there are some great books out on the topic. For me, Data Science equates to statistics in a digital age.
Before 2000, data was stored in databases and spreadsheets, and we often had to extract comma- or tab-delimited files to move it between platforms and applications.
Today, we work with BI (business intelligence) tools that do most of the data gathering, analysis, manipulation and charting for us. And the data is as likely to be stored in the cloud as anywhere else.
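For anyone who never lived the delimited-file shuffle, here’s a minimal sketch, using Python’s standard csv module, of the kind of export-and-reimport step we used to do by hand (the file name and columns are invented for illustration):

```python
import csv

# Hypothetical example: dump a few rows to a tab-delimited file,
# the way we once moved data between platforms and applications.
rows = [
    ("2024-01-01", "Widgets", 120),
    ("2024-01-02", "Widgets", 98),
]

with open("export.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["date", "product", "units"])  # header row
    writer.writerows(rows)

# ...and read it back in on the "other platform".
with open("export.tsv", newline="") as f:
    for record in csv.reader(f, delimiter="\t"):
        print(record)
```

Today a BI tool hides all of that behind a connector and a dashboard.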
Back then, our virtual world consisted of Zork, or maybe Doom (if we were of age). We could easily separate the two worlds.
Now our virtual (or augmented) world consists of wearables and devices that truly immerse us into a new reality that can trick the mind (yes, some easier than others).
Computers were computers. Some of them were “super”. Most were rather slow and big, but faster than the human brain and hand could work.
Right now, they’re all “devices” of varying size and power: smartphones, smartwatches, tablets, laptops, not to mention the day-to-day machines we use (TVs, robots, self-driving cars, appliances, homes). Much of the technology is smaller thanks to advances in electronics, plus the move of storage to remote servers. They are still faster than the human brain and hands working together.
In the last century, Artificial Intelligence was either a chess computer (IBM’s Deep Blue) beating the reigning world champion or something dreamt about or seen in movies.
Today, AI, along with Machine Learning, means systems and robotics that mimic *and* predict people’s actions and thoughts, then act upon them. They have reached the point of active use in industries like logistics (hopefully everyone has seen a video of an Amazon warehouse), health care, space exploration, customer service, and manufacturing.
In the 1900s, we spoke simply about automating processes and making the work easier for everyone. There was no revolution. It was called the age of computers.
Today, we speak about Industry 4.0, the Internet of Things (IoT), and the Industrial Internet of Things (IIoT). We are automating processes at home, at work (in any sector), and in places of service and travel. Essentially everything is in scope to maximize automation and connectivity between the different “things” in our environment. In manufacturing, smart technology is being adopted far more frequently, not surprisingly under the term ‘smart manufacturing’. Sensors can indicate pending machine failure by monitoring key characteristics such as temperature, pressure, and vibration. Other sensors can monitor the flow of work to ensure customer demand is being met. On the office side, we have BPA (business process automation) and RPA (robotic process automation), non-mechanical implementations of AI and process mining that help make workflows easier, be it on the customer service, sales or administrative side. Think call centres, technical support, recruiting, and accounting.
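To give a flavour of the sensor side, here’s a minimal, rule-based sketch in Python. The sensor names and thresholds are invented for illustration; real smart-manufacturing systems learn such limits from historical data rather than hard-coding them:

```python
# Hypothetical limits -- real systems derive these from history.
LIMITS = {
    "temperature_c": 85.0,   # bearing temperature
    "pressure_kpa": 650.0,   # hydraulic pressure
    "vibration_mm_s": 7.1,   # RMS vibration velocity
}

def check_machine(readings: dict) -> list:
    """Return a warning for any reading over its limit."""
    warnings = []
    for sensor, limit in LIMITS.items():
        value = readings.get(sensor)
        if value is not None and value > limit:
            warnings.append(f"{sensor} at {value} exceeds limit {limit}")
    return warnings

# Example: one snapshot of sensor data from a machine.
snapshot = {"temperature_c": 91.2, "pressure_kpa": 610.0, "vibration_mm_s": 7.8}
for w in check_machine(snapshot):
    print("MAINTENANCE ALERT:", w)
```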
Whew! A lot has changed in a relatively short time.
What are the themes I take away from all this? The names and terms have evolved and advanced. Technology has gotten faster, smaller, more remote, and all-encompassing. Part of the advancement is our increased understanding and use of the sciences (math, stats, physics, chemistry). Some of it is human nature and our need to innovate and try new things to make the world a better place.
Technology is now fully integrated into our world, and it will take us for a wild ride in the next century. I am constantly reminded of two movies, Blade Runner and The Matrix, whenever I think about the technology transformation underway. We are getting closer to those worlds every decade.
To quote a famous GM commercial from years gone by, featuring “light years ahead of his time” Captain Kirk, this isn’t your father’s “I.T.” anymore.
Darren