Change has always been fast-paced in the world of IT, but part of the appeal of DevOps is that it helps organizations become even more agile. With DevOps, containers, and microservices, things seem to change almost as quickly as you start to understand them. How can formal education, such as college degree programs, or IT training possibly keep up, stay relevant, and remain worth the investment of time and effort?
I wrote this post about the challenge of keeping IT education on pace with the quickly changing world of DevOps:
IT education has been a problem since the dawn of IT. Things change quickly in the world of technology, and it’s difficult for institutionalized education to keep up. By the time a college-level course is developed and the associated textbook has been written, edited, and published, much of the information is already obsolete. Now that we’ve entered the era of DevOps, that rate of change has reached hyperspeed and shows no sign of slowing down any time soon.
A recent article described the ongoing struggle companies face to find the right skills and talent. The article cites a study from CA Technologies that highlights some of the challenges, including a growing skills gap in the apps economy as organizations shift to a software-driven mindset. The article sums it up: “Ultimately, the lack of skilled—or available—workers is fueling company concerns. According to the CA Technologies survey, 42 [percent] of respondents said that the lack of knowledge or requisite skills prevented them from effective responses within the app economy, with 52 [percent] saying that governmental assistance or investment in technical or STEM education could go a long way to closing the gap.”
Have schools and programs evolved to deliver a workforce with the skills that modern dev and IT operations practices demand in today’s enterprise? Can higher education adapt to keep up and continue to provide value, or will the burden fall on smaller, industry-specific courses and certifications? Or should students just get in the trenches and teach themselves because formal education is simply incapable of staying current? All of these questions have some validity. Let’s take a look at the pros and cons of each.
Formal education: Is computer science keeping up?
Formal computer courses have been out of date pretty much since there have been computers. By the time my school introduced an introductory course teaching BASIC on Apple II computers, I had already taught myself BASIC on a Commodore 64 and ended up educating the teacher more than he educated me. It can take years to develop and approve a formal high school or college-level course, and by that time the information is already out of date and the world has moved on to new things.
That said, there are still a number of advantages to formal education. “We can all make jokes about how slow the traditional educational system can be to change, but in terms of sheer resources it dwarfs the commercial training sector, and traditional undergraduate and graduate degrees still offer meaningful career value,” claims Charles Betz, founder of Digital Management Academy and adjunct faculty at the University of St. Thomas. “I believe that students coming out of these programs can and should be much better prepared for [the] digital industry and that this is not a merely ‘vocational’ objective.”
See the full post on TechBeacon: The state of DevOps education: Can training keep up with the tech?