The History of IT Management in 5 minutes

IT Management is infamous for requiring constant upskilling, adaptation and innovation. It's a role dictated by rapidly changing technological norms, one that demands even the most storied professionals either keep up with the cutting edge or fall into obscurity.

Many of us who initially got into IT simply because we were good at computers may find that, between security policies, software licensing, networking, and project management, physical computers have turned out to be the least of our concerns.

It can be daunting to imagine where technology will take us or how it will shape our careers, but by taking a look at the way IT management has changed over the years, we can better understand the work we'll face in the future.

A niche skill set (1970s)

In the 1970s, computers were still an object of fascination - viewed by the masses as something unique to government, corporate conglomerates and science fiction. 

Computer technology was a rarity, used primarily by hospitals and large companies. David Bennet, a long-time Australian IT worker, recalls building a "master index so patients could be looked up using a keyboard instead of via microfiche and physical cards" for an Australian hospital around 1974.

IT people from this era dealt largely in the physical, working out ways to translate data storage into practical use.

Corporations such as IBM and Sperry (now Unisys) competed to secure government contracts, dealing with bulky hardware that offered little storage. One of the most popular inventions of this era was the floppy disk, pioneered by IBM in its 8-inch form and later shrunk to a 5.25-inch format that offered a whopping 110 kilobytes of data.

Today, over 90 per cent of small and medium businesses depend on digital tools for communication.

For anyone born before the '90s, however, we know that technology wasn't always like this. In fact, many businesses scarcely adopted computers at all until the 1980s.

Personalizing the computer (1980s)

In 1981, IBM released its first personal computer, designed for use in the common workplace. Running Microsoft's MS-DOS and sporting up to 320KB of storage, the IBM Model 5150 kicked off a gradual normalization of business computing.

Despite IBM's near monopoly on the market, computers were still something of a novelty; the PC's biggest year in sales was 1983, with around 750,000 units sold.

While PCs were being made more widely available, they weren't exactly a household staple. One might find the odd computer under a rich family's Christmas tree, or in the finance department of a well-off small to medium business.

Business IT responsibilities were often outsourced, or simply allocated to the person who spent the most time working with the office PC. Their work typically involved introducing and managing spreadsheets, and eventually upgrading to accounting software.

By 1989, however, workplace demand for computers was lucrative enough for Microsoft to release Microsoft Office, encompassing the earliest iterations of Word, Excel and PowerPoint.

Vendors were introducing user-friendly features such as graphical user interfaces and mice, both of which were popularized by Apple's Lisa computer at the start of 1983.

Also toward the end of the decade, storage saw major improvements not in capacity, but in optimization, via the introduction of tape libraries and, a few years later, zip drives.

IT managers ended the era of glam rock by introducing humble, mostly isolated systems with an emphasis on usability and data management - largely unaware of the explosive innovations to come.

Hello Internet (1990s)

There were early rumblings of an "internet" in the 80s, but they were largely misunderstood and drenched in rumor. 

Between 1993 and the very early 2000s, people largely connected to the internet via the ear-piercing dial-up offerings of providers such as AOL and CompuServe, often redeeming the time-limited trial disks AOL mailed out en masse.

Both connecting to the internet and exploring the web were clunky, unstructured experiences, yet by 1996 there were a collective 36 million users online.

In the same year, webmail clients were being introduced to the workplace, and IT managers were scrambling to manage email accounts and passwords within their organizations. The infamous "You've got mail!" notification quickly rose to pop culture status, and businesses were building both websites and online stores with the assistance of IT experts.

The value of an internet presence was quickly apparent, and the number of globally registered domains grew from 120,000 to 2 million in three years. 

Meanwhile, Microsoft, Apple and the open-source community were introducing foundational operating systems - Windows 95, Mac OS and, well, Linux - ubiquitous technologies that bridged the gap between household and corporate PCs.

Between overseeing email, websites, and a plethora of newly incorporated software in the workplace, IT managers worked diligently on contained, yet radically evolving corporate systems.

By the end of the decade, a team at Australia's CSIRO had invented the technology later adopted for WiFi, and the number of people exploring the web had reached 360 million.

Clouds above (2000s-2010)

While IT managers had their hands full connecting their businesses to the internet, Apple, Google, Microsoft and Amazon were hard at work on the era-defining technology of the 2000s.

In 2006, Google released Google Apps (later G Suite, now Google Workspace), the SaaS platform that eventually included Gmail, Google Docs, Drive, and Calendar.

One year later, Apple released the iPhone, introducing a cultural shift that bridged the gap between home and the workplace.

Employees took their work home in their pockets and, in turn, started to bring their own devices (BYOD) to work in the following years.

The security and managerial implications of these cultural changes were massive, and radically shifted the range of responsibilities IT managers had in administering staff accounts, devices and applications.

Furthermore, AWS, Google App Engine, OpenStack and Windows Azure all launched around this time, introducing new levels of accessibility for businesses looking to establish a cloud setup.

Slowly but surely, IT managers found themselves configuring private clouds that let corporations access the same databases from across the globe, and gradually incorporating streamlined Infrastructure-as-a-Service solutions across their companies.

Now (and Beyond)

In 2011, hybrid cloud (a model that combines privately owned servers and infrastructure with public cloud services) started to gain traction.

IT responsibilities began to encompass the allocation and ongoing maintenance of off-site resources, and servicing users in far-flung locations under unified systems.

By 2014, cloud computing was the norm in organizational IT. The industry had shifted from computers in the workplace to complex, interconnected systems of data management, resource optimization, third-party risk, security and compliance.

While mainstay Software-as-a-Service (SaaS) platforms such as Salesforce, HubSpot, G Suite and WordPress had existed for years, this era also saw the widespread adoption of SaaS as the preferred way of operating business software.

And today, 94% of companies use cloud services.

Looking back through the decades, the running theme seems to be whiplash: technological changes coming and going, redefining careers faster than we can keep up.

IT workers have a habit of learning how to use technology years after it's made a major impact, and given that history repeats itself, many companies are ill-equipped for this decade's defining IT challenge: shadow IT.

The mass adoption of remote working, largely on account of the ongoing global pandemic, has exacerbated decentralization in IT. 

Workers typically use multiple work devices, a plethora of sanctioned and unsanctioned apps, and collectively connect from hundreds of different networks in any given week, at any given hour.

With ever-changing regulatory demands, and companies such as Meta gearing up to once again change the face of technology, it can be difficult to pinpoint what a career in IT management will look like ten years from now - but it's crucial to understand and work with the state of IT today.

The best way to prepare for the future is with visibility and control over the technology in your business now.
