As an instructor and a current software developer, what are you offering in your class curriculum that is more real-world applicable?
Do you feel that colleges currently offer students who major in IT skills in the most current languages and techniques to help them in the workplace?
This is actually a pretty subtle question with a lot of gray area.
The purpose of higher education is to teach students "how to think", not "what to think". It is therefore better to provide a high-quality education in the fundamental principles of software programming (coding principles) and software engineering (design skills and teamwork). However, this doesn't mean that an IT degree should be taught with technologies that are ancient in Internet "years".
For example, RIT (http://www.se.rit.edu/) currently teaches Python and Java in its introductory courses. These languages are not exactly the vanguard of modern computing, but from a pedagogical view they support the goals of the education.
Of course, the choice of technologies changes from time to time to adjust to *major* shifts in the industry. When I was in college (the '80s), for example, Pascal, Fortran, and C were the choices. RIT has an industry advisory board that helps the faculty discuss such changes to the curriculum.
As a side note, you also asked about "real-world applicability." What is important to meet this requirement is:
(a) software projects are team-based (not individual);
(b) the course project is one moderately big application;
(c) delivery of the project has multiple iterations (usually just two).
This is how the RIT SE program does it, and it works really well.