Learning to develop software has become a lot harder. You may argue that programming is still the same: we still have loops, conditional statements, the same patterns, and maybe even similar data access technologies. So why shouldn't writing a sort-of-useful, data-enabled application be the same as it was 10 years back?
The thing is, programming is still the same, but software development is much more than programming. I found this out when I sold my first piece of software in 1996, a Windows file manager. The users didn't just expect an .EXE. In addition to developing the application, I also had to learn how to make .HLP (Windows help) files, create an installer, and think about licensing and versioning, among other things.
Unfortunately, developing complete software has become a lot harder today. Things which could once be taken for granted no longer can. Take the platform, the operating system, for instance. The number of platforms people use has changed drastically. Back then, the PC was the center of an average technology user's life, Linux was in its infancy, and MacOS was unheard of. Backward and even forward compatibility was almost guaranteed on Windows, mainly because of the stability of the Win32 API; most Windows applications would work on any Windows flavor without any change. Life was good!

Today the PC is no longer the center of a tech-savvy person's life; it's his or her phone or tablet. To complicate matters further, you can't even take the phone or tablet's operating system as a constant: it could be Android, iOS, or maybe even Windows. Being a good programmer is no longer enough to guarantee something usable, either; you also need above-average skills in user experience and UI design, because users have become quite intolerant of average-looking software. And let's not forget the hoops a mobile developer has to jump through to get an application published.
With the increase in connectedness via the Internet, most applications now need to enable data access and sharing, and from almost anywhere at that. It's no longer enough to know how to program the device; you also need to know how to make the data available to the application via the Internet. And when it comes to the Internet, we again have a plethora of options, each with its own steep learning curve. AWS? Azure? Google? Who uses traditional hosting anyway? And if you do, you aren't going to get any respect from snooty architects. Oh, and by the way, did you write automated unit tests? No? What a shame, you're not a real developer! So now you've nailed down the cloud provider you want to use for your application, but it's difficult to keep using that infrastructure in a cost-effective way - basically at zero cost, since all you wanted to make was a somewhat-useful application with the main intention of learning new technology. Learning technology doesn't come cheap anymore.
Gone are the days when cutting-edge technology meant learning things like desktop programming, distributed programming with Java or DCOM, or building a dynamic website with PHP and MySQL - all of which could be done on a single machine at home, leaving you feeling you had learnt something substantial. Today's technology world respects only mobile, the cloud, big data, and analytics.