Info: I make my living in that area and teach programmers the ways of the sharp "C". I am not a manager (yet) though. To me, it is important to know WHAT you are doing just as much as the various HOWS.
Every software development process has a few things in common. One: there's a set goal to achieve. Two: we want to get there with as little work as possible.
There are also a few goals that are "implicit" and almost never defined, but always assumed by everybody (except the developer): One: it has to be secure. Two: it has to scale well. Three: it has to be easily maintained. Four: it should cost nothing. You can already guess that these things contradict one another, can't you? I'll go into details later on.
My description here assumes that all the preliminary work of writing specifications and choosing the systems/frameworks to use has already been done.
The main problem in the real world is a glaring gap between how software CAN be written and how software IS written. Good software that manages to evenly balance all of the above requirements takes some highly qualified and motivated coders... not the kind you get for cheap. And here's where reality kicks the software industry in the butt: cheap "programmers" can do just that: "program". They can read specs and somehow manage to translate them into code that works. But they have next to no understanding of how that thing really works or what kind of mistakes they just made all over again. They can put their days or sometimes even weeks of training to semi-good use and "make it work". Do they know anything about security? Performance? Do they try their code with "just one client", or do they think ahead about a load test? How about an intrusion test? No way. They are busy trying to get that SQL statement to work. A skilled programmer will spend the same amount of time - at a higher paycheck, hopefully! - optimizing that same query and making it robust.
Sadly, money talks and has ultimate control over the development process. This is why there are next to no skilled programmers on the job and more and more "new programmers" as they pop out of school. They won't grow old in this business either. In my experience, once they have gained some working knowledge and - of course! - want more pay, they either get promoted to a place where they can't do any good anymore or they get fired to hire the next cheap guy.
How can I make such a statement? Look around the business! I feel like "the old guy" and I'm just 38... When I started, there were programmers as old as I am now... where are they? No longer programming, or just gone for the most part!
And how is it possible that new software STILL ships with stack-based buffer overflow exploits and even SQL injection possibilities?
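To make the SQL injection point concrete, here's a minimal sketch in Python with sqlite3 (the table, data, and attacker string are all invented for illustration): pasting user input into the statement lets the input rewrite the WHERE clause, while a parameterized query treats the same input as plain data.

```python
import sqlite3

# Toy in-memory database for the demonstration (names are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t'), ('bob', 'hunter2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is concatenated straight into the statement,
# so the OR clause becomes part of the query and matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe: the ? placeholder keeps the input as a value; no row has
# that literal name, so nothing leaks.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(vulnerable)  # both users leak
print(safe)        # []
```

The fix costs nothing - it's the same query with a placeholder - which is exactly why shipping the vulnerable version in new code is so inexcusable.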
Of course management (and academia) has realized that and has pushed forward something that I dread: "Patterns"... Ugh. Patterns are a crutch (or banana) to get the code-monkey-programmer to type what's considered "good code" instead of flinging their excrement at the compiler. A skilled programmer doesn't need "Patterns" - they come naturally and are applied "as needed", not "as dictated". A fine difference here! Code monkeys get told: "implement this program using patterns X, Y and Z because that's our policy" - skilled programmers take notes of the goal, decide what pattern fits best and often will mix and match.
I can hear all of those "software engineering process management certificate bearers" cringe already... "But what about Scrum (or whatever it's called), and what about this bible of patterns?" - again: that's for code monkeys. Skilled programmers have in-depth knowledge of the programming language and don't need comments like "We implement an event listener, pattern 69" - they will see that. They need comments like "We check for condition X because of specs section 7".
But again: if you let skilled programmers make good use of their time by NOT forcing "schema X" on their code, chances are that you have to replace the skilled programmer with another skilled programmer... uh-oh... no financial benefit in firing that guy! Can't have that!
And the net result is: we have convoluted code that runs in a framework that's based on a framework that's part of another framework that uses compatibility features of the OS to run... Don't get me wrong here: I'm not suggesting we move back to assembly language! That's a thing of the past for applications. But using things like Hibernate and/or Entity Framework to wrap simple SQL into several layers of code and libraries, just to execute the SQL that we could write ourselves? Ugh. Just Ugh. Why are people using it, though? Because it creates safe code without having to learn SQL. Yay for code monkeys! I haven't seen one such library that wouldn't have issues when "extensions" or "changes" in the data model come along, but the main slogan is "it just works and makes your life easier". I tried. Nope. It won't. Not when you know how to do it the real way.
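For contrast, here's roughly what the "write it ourselves" approach looks like, sketched in Python with sqlite3 (the schema, table name, and helper are all invented for illustration): one thin, hand-written, parameterized function instead of several framework layers - just as injection-safe as an ORM, and you can read the SQL it runs.

```python
import sqlite3
from collections import namedtuple

# Invented record type and schema for the sketch.
User = namedtuple("User", ["id", "name"])

def users_older_than(conn, min_age):
    """Hand-written, parameterized query; no ORM layers involved."""
    rows = conn.execute(
        "SELECT id, name FROM users WHERE age > ? ORDER BY name",
        (min_age,),
    )
    return [User(*row) for row in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, age INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)",
                 [(1, "alice", 34), (2, "bob", 19), (3, "carol", 51)])

print(users_older_than(conn, 30))  # alice and carol
```

When the data model changes, you change one function and one SQL string - no waiting for a mapping layer to catch up.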
Another big gripe I have with the way Visual Studio is evolving is the overuse of NuGet references... While it's a not-too-bad idea to have a utility that lets you look up and download free libraries, I can't fathom why anybody would want to automate that task down to the source code level... I recently had to spend an entire day "upgrading" my code to use a newer version of a library, because I added a sidecar library which also needed that reference, and the version I had in use was no longer available. Yay for imaginary comfort! Downloaded and manually referenced libraries just work for as long as I want them to work. And I choose to update them when I'm good and ready, not at the worst possible time when there's no time.
After venting some about "patterns" and "libraries upon libraries", I have to point out that I'm at least partially happy with the recent rise of "unit testing". I can totally see the benefits of unit testing: "will this bugfix cause problems?" - click - No! Yay! See? That's a benefit. This brings up two questions: "Am I using unit tests?" and "Why only partially happy?". The short answers are "No" and "UI testing". In detail: I don't use unit testing myself because I firmly believe it takes a team of at least two people for unit tests to be more than "just work". If I write the unit test for code that I either just wrote or am about to write, I'll inevitably avoid my own mistakes... i.e. the unit test will be trimmed to the implementation, not the actual requirement. But I do recommend unit tests to every team of two or more! And why "UI testing"? Simply because I don't believe in automated UI testing. If the application is properly coded, the UI will not functionally suffer from changes in the code (those are covered by unit tests) - but asserting the "OK"ness of a UI always requires a U to I... a "user to interface" with the application. Everything else will yield funny results. Like white text on a white background... the UI test will of course see the text, because it reads memory, not the screen... but the user will not. Hence I recommend strict UI/code separation and running the UI tests manually, after all the code tests succeed and the product is about to go final. To put it in managerial lingo: automated UI tests pass too often while the UI is actually broken for them to be a real benefit.
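The "trimmed to the implementation" trap is easy to demonstrate. Here's a made-up sketch in Python (the function, the amounts, and the "spec section 7" rule are all invented): the implementer's own tests dodge the boundary their code gets wrong, while a test written against the requirement probes exactly that boundary.

```python
# Hypothetical spec, section 7: orders of 100 or more get a 10% discount.
# Amounts are whole units so the arithmetic stays exact.
def discounted_total(amount):
    # Buggy implementation: the programmer wrote > where the spec says >=.
    return amount * 9 // 10 if amount > 100 else amount

# Tests "trimmed to the implementation" happily avoid the boundary:
assert discounted_total(150) == 135   # clearly above the threshold: passes
assert discounted_total(50) == 50     # clearly below the threshold: passes

# A test written against the requirement probes the boundary case
# and exposes the bug (100 should already be discounted to 90):
try:
    assert discounted_total(100) == 90
    print("boundary test passed")
except AssertionError:
    print("boundary test FAILED: spec says 100 qualifies")
```

A second person writing the tests from the spec alone would have started with the boundary value - which is exactly why I only recommend unit tests to teams of two or more.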
To come back to the "hurdle" reference of the title: I'd compare a project driven by skilled programmers to a sprint, whereas code-monkey projects tend to look like pole-vault-hurdle relay races...
Bottom line: as long as we hire code monkeys instead of programmers because they cost less, we are bound to end up with convoluted, vulnerable software - we get what we pay (less) for. Free software isn't any better overall: there, the code monkeys sometimes think they can run a software project on their own, and now we have duct-tape code all over. But free software is often also open source software, so the educated among us can at least decide whether they want to use it or not.
There you go, that was some venting about the current (sad) state of affairs in the world of software development.