Recently, I came across an interesting article about the skills gap in the tech sector. I went in ready to read about how technology is evolving at a rapid pace, how new technologies are being introduced every day, and how employers are desperately looking for technologists with the right skill set. However, the author took an unusual stance on the topic and proposed that the skills gap does not exist, at least not to the extent I believe it does.
On one hand, you have reports stating that in just two years there will be one million unfilled job openings in the tech sector while universities produce only 400,000 computer science graduates; on the other hand, you have a competing narrative in which jobs that require higher-level computer skills are filled easily.
Are colleges creating the talent needed in tech?
As someone who has been writing endlessly about the IT skills gap, I wanted to make sure I did my research and answered a few quick questions before diving deeper into the topic. The first question that needs to be addressed: Is there an IT skills gap or a skill set mismatch?
The answer is a resounding yes.
According to the U.S. Bureau of Labor Statistics, there will be roughly 1.3 million occupational openings for IT professionals by 2026. Compare that with the number of computer science graduates produced every year (close to 60,000, according to the National Center for Education Statistics) and we have a skills gap.
Furthermore, the skills gap is not just a function of how many computer science graduates are produced every year; it exists because technology is moving at a pace that simply cannot be matched in the academic world. Hundreds of thousands of patents are filed every year. We live in a world of self-teaching AIs and self-driving cars, and have reached the point in our technological evolution where we are seriously thinking about and planning to colonize Mars.
In a world where more than half the population has access to all the information known to man, conventional education simply cannot keep up, even if institutions revise their curricula on an annual basis. The cost of doing so alone makes it impossible for colleges to churn out the kind of graduates who can take over high-level jobs in the tech sector.
Do you need a college degree to start a career in tech?
Let’s revisit the report by the Bureau of Labor Statistics. It considered 17 major computing occupations, grouping some related roles into a single occupation. Out of the 17 occupations considered:
- One requires some college, no degree
- Two require an Associate’s degree
- Nine require a Bachelor’s degree
- One requires a Master’s degree (Computer and Information Research Scientist)
- Four require no college, no degree
Does the degree help you earn a better paycheck?
Yes. Almost all occupations with a median wage of $100,000 and above require at least a Bachelor’s degree.
Do you need a degree to get started in tech and make upwards of $100,000 per year? Is there ROI on the cost of getting your bachelor’s in CS?
No. Database and systems administrators, network architects, software developers, and programmers do not require a degree or college education to get started, and the annual median income for these occupations ranges from $87,000 to $97,000. Given that this is the median income, one can safely assume that there are people in these professions making $50,000 per year and others making upwards of $100,000.
One important factor to consider here is that this data represents jobs, i.e., you acquire the skills and/or degree and start working for someone. It does not count freelancers, contractors, and entrepreneurs who work for themselves and use their skills to generate a healthy income.
What can you do when a college education is not enough?
Colleges and universities are doing good work in educating the masses, but given the pace at which technology is evolving, it is up to the technologists of the world to take charge of their learning and continue their pursuit of knowledge, through formal and informal learning, to keep up with the industry. In North America, IT professionals tend to think this is the responsibility of their employer, but in much of the rest of the world, individuals invest in their own IT learning and development. If your employer is not ready to invest in your development, you need to take matters into your own hands; either way, you cannot afford to stop learning new skills and fall behind.
If you refuse to learn new skills in IT, your skill set becomes dated and eventually obsolete. When you feel that your daily tasks can be automated, and that a bot can do your job better than you, you need to acquire new skills to stay relevant and ahead of the curve (maybe learn to build said bots).
Having said that, technical education, especially IT skills training, does not come cheap. Sometimes you will have to set aside hundreds of dollars for it, and sometimes the amount can run into the thousands. Finding that kind of money in your budget, and finding the time needed to take courses and learn new skills, can prove challenging to say the least. That is why I believe it is not just the responsibility of technologists to invest in their development, but also of the organizations that employ them.
The role of organizations in employee development
I have been a longtime proponent of lifelong learning, since before I got into the business of IT skills training. All of us should become lifelong students and dedicate our lives to gaining knowledge and using it for the betterment of humanity. This stands doubly true for people (and organizations) in the tech sector. The concept of a teaching organization is nothing new, and there are plenty of organizations that invest a ton of resources in employee development. While it sounds great in theory, it does come with its own set of challenges.
The biggest challenge is cultivating a culture of lifelong learning in your organization. Your employees may be great at their jobs, and they may be adept at using the technology you currently have at hand, but can they handle new technology?
- Are their skill sets aligned with your technology roadmap?
- Is there a skill set mismatch?
- Are they open to the idea of learning new skills?
- Can they complete their projects on time, on cost, with high quality?
Creating a good employee development plan
Addressing the questions above will lead you to an “IT skills learning and development” roadmap that is tied to your technology roadmap. That is easier said than done, because building an employee development program that works is challenging to say the least. You cannot just rely on your L&D department to come up with the plan. As an IT leader, you must take the initiative yourself or work with the L&D resources within your organization to create the roadmap. Everyone on your team has a different level of experience and a different educational background, and they all have different ways of acquiring knowledge, retaining it, and implementing what they’ve learned.
Some technologists like to read and learn new skills by reading books, articles, and blogs. Some like to learn through video tutorials and webinars, and some prefer good old-fashioned training sessions in a classroom (or a virtual classroom).
The rules and methods of learning are constantly changing and differ drastically from person to person. We (at QuickStart, of which I am the CEO) conducted a survey which indicated that 40 percent of IT professionals prefer informal learning, meaning webinars, YouTube videos, reading or writing blogs, attending events, and so on, while approximately 60 percent prefer formal learning methods such as classroom training, virtual instructor-led training, or online IT training.
Your employee development plan needs to cater to these different methods and modes of learning. Implement a solution that supports multi-modal knowledge transfer, which may include online IT training that lets a learner go back and reference the material, virtual classrooms that allow a learner to interact with the instructor, social and peer-to-peer learning, simulations and labs, discussion boards, and so on.
Furthermore, the ability to track formal and informal learning, along with analytics, would be the icing on the cake. This gives you data that tells you where your employees are spending their learning and development time and whether that time is aligned with your IT roadmap. All of this may sound excessive, but it is worth the investment: your employees will become more productive, and it will reduce your project risk as well as turnover. Beyond an improved retention rate, you will be able to promote from within, and new leaders will start emerging from your technology team. This can substantially increase your brand value and ROI, a topic I may address in my next blog.
To summarize, the IT skills gap and skill set mismatch are very real. It is impossible for colleges and universities to keep up with the increased demand, and it is not just up to employees to invest in their own development; it is critical for IT leadership to invest in training employees to stay abreast of the latest technologies tied to their roadmap.
This article is published as part of the IDG Contributor Network.