
Filling key IT roles is tough these days, especially in cybersecurity. Cengage CTO Jim Chilton shares valuable advice on overcoming the top challenges.

Attract and retain cybersecurity talent

Attracting talent is a significant industry challenge – one reflected in the roughly 700,000 open cybersecurity jobs in the U.S. today.

One shortsighted action many organizations take is requiring a traditional four-year degree in computer science, cybersecurity, or a related field for an entry-level role. Cengage’s 2022 Employability report found that 57 percent of tech employers would decline to interview a candidate who didn’t have a degree, regardless of whether the candidate had the necessary experience – even though 44 percent of tech workers started their careers in another field.

A successful career in cybersecurity does not require a degree in cybersecurity or technology. Employers must adapt by loosening stringent degree requirements and instead evaluating candidates on their skill sets and aptitude for the job. That includes welcoming candidates with certifications or badges earned from credible IT courses on foundational concepts like cloud security, zero trust, coding fundamentals, or data science.

In the interview process, IT leaders can gain a better sense of overall aptitude and soft skills by asking candidates about their past work on teams, asking how they manage time and priorities, and gauging their enthusiasm for learning and growing their skills.

Retention is also a challenge, with nearly half of cybersecurity professionals considering leaving the industry this year. As the number of cyberattacks remains high, so do the stress levels of cybersecurity workers, who often feel as if they are “always on call.” This goes hand in hand with recruitment – unfilled vacancies mean more work for fewer people.

To better retain workers (especially amid labor shortages), organizations should focus on adequately training all of their employees to recognize potential threats, as many attacks are easily preventable. For example, more than a third of all cyberattacks result from phishing. To combat this, organizations must mandate routine IT and cybersecurity awareness training for their entire workforce to reduce the risk of being hacked.

[ Related read: IT hiring strategies – and 5 illuminating interview questions to ask candidates in 2023. ]

Sharing best practices on how to spot common hacker strategies such as phishing will ultimately result in fewer preventable cyberattacks and less stress for cybersecurity workers – and, ideally, will make it easier to retain IT and cybersecurity professionals.

Implement apprenticeship programs

In August, the Department of Labor announced a new Cybersecurity Apprenticeship Sprint program to combat industry talent shortages. 

Apprenticeships are extremely valuable for both employers and candidates. For employers, apprenticeships are a cost-effective way to groom talent, providing real-world training and a skilled employee at the end of the program. Apprenticeship programs also reduce the ever-present risk of hiring a full-time entry-level employee who may prove not to be up to the required standard, or who may decide that the organization or industry is not a fit.

For workers, an apprenticeship is essentially a crash course providing the opportunity to earn while they learn. With the average college graduate taking on $30,000 in debt (and many taking on much more), a degree is increasingly out of financial reach for many Americans. Apprenticeships are an excellent way for people to gain tangible work experience and applicable skills while also providing a trial run to determine whether a career in cybersecurity is right for them. For me, apprenticeship programs are a true win-win.

During National Apprenticeship Week this year, we joined the Department of Labor’s event at the White House to celebrate the culmination of the 120-day Cybersecurity Apprenticeship Sprint. It is exciting to join other tech leaders and brands focused on tackling the serious skills and labor gap in this high-demand industry.

Train employees for rapid innovation

Industry leaders often point out that rapid technological advancements are outpacing the ability to train the workforce properly.

With roughly 2,200 cyberattacks each day, it’s clear that hackers and the technologies they leverage are becoming more sophisticated.

It’s unrealistic to expect recent college graduates, apprenticeship graduates, or credential earners to possess all the skills needed for a decades-long career in cybersecurity. Learning and development must be key components of the employee lifecycle to keep up with the fast-paced growth of new hacking threats.

Employers must commit to continuously upskilling their cybersecurity workers. Leadership can’t expect overburdened tech teams to also take it upon themselves to study new hacking techniques or enroll in a course to learn the latest security software – the onus is on employers.

Employers should be in regular contact with their IT teams, asking what additional tools they may need and providing sponsored training opportunities. Consider offering personalized training and certification recommendations for in-demand cybersecurity positions, which enables enterprises to upskill and cross-train talent at scale.

Rethink your recruiting practices

The tech industry, in general, is notorious for its lack of diversity. Currently, only 25 percent of tech workers are women and only 7 percent are Black.

Representation for women and people of color in big tech is significantly below the national average across all sectors. To begin chipping away at the marked lack of diversity, we need to meet diverse talent pools where they are and revisit hiring practices.

We already discussed the value of apprenticeships and the role of skills-based hiring, but not from a DEI perspective. Many employers still require a degree for entry-level jobs. However, research from Opportunity@Work has shown that adding a four-year degree requirement automatically screens out 76 percent of African Americans, 81 percent of Americans in rural communities, and 83 percent of Latinx workers.

By rethinking degree requirements, considering candidates from non-traditional education paths, and facilitating apprenticeships (especially for entry-level roles), tech teams can welcome diverse and skilled talent into their organizations.

Employers can also consider forming partnerships with community colleges and HBCUs, which tend to have more students from diverse backgrounds. By partnering with these educational institutions, employers can directly engage with often-overlooked student populations that include many high-achieving minority students (for example, 33 percent of Black high school graduates with a GPA of 3.5 or higher attend community colleges).

Employers can expand and diversify their talent pipeline by providing these students with real-world learning opportunities, such as internships, and by building their skills so they can eventually take on entry-level tech roles.

Once diverse talent is on board, employers must also ensure those employees feel supported and have a sense of belonging. This will require a cultural shift within IT teams and technology companies, with attention to implementing comprehensive DE&I strategies, Employee Resource Groups (ERGs), and equal opportunities for underrepresented groups to learn, grow, and advance to more senior roles.

[ Want more expert insights on leadership, strategy, career development, and more? Download the Ebook: 37 award-winning CIOs share essential IT career advice. ]
