Software development is becoming more complex as the requirements for software increase. Customers expect more advanced features and a decent level of optimization.
As an IT person, I would state the opposite: thanks to the tools now available for integrating software developed by multiple people and keeping it working, fighting bugs has become so much easier and faster than it was, say, 10 to 20 years ago. Software development and integration have improved so much that it puzzles me why companies keep making the same mistakes over and over again.
IDEs, the tools in which we develop software, have become very good at supporting the developer; there is great version control available, like Git; many cloud solutions are almost plug-and-play, making the pieces of code ever smaller (a good thing for stability and quality control). Pipelines that integrate tests and run them automatically with every change, to catch regressions in parts that were built before, are standard practice almost everywhere. Docker images that prevent configuration drift between test, acceptance, and production environments are another example of a very good solution to a problem we used to have but now hardly experience.
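The Docker point is easy to illustrate: an image is built once and then promoted unchanged through test, acceptance, and production, so the environments can no longer drift apart. A minimal sketch (the base image, app module name, and port are made up for illustration, not taken from any particular project):

```dockerfile
# Build one image and promote the same tag through every environment,
# so test, acceptance, and production all run identical bits.
FROM python:3.12-slim

WORKDIR /app

# Install pinned dependencies first, so this layer stays cached
# until requirements.txt actually changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Environment-specific settings (database URLs, credentials) are
# injected at run time via environment variables or mounted config,
# never baked into the image itself.
EXPOSE 8000
CMD ["python", "-m", "myapp"]
```

The per-environment differences then come in from outside, e.g. `docker run -e DATABASE_URL=... myapp:1.4.2`, instead of rebuilding the image for each environment.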
In short: from a software development perspective, it has all become easier and better integrated. The observation I would share is that Windows software keeps being buggy in places you wouldn't expect, like crashing after locking the screen and putting the computer to sleep by closing the laptop. I was greeted by a black screen and a computer that wouldn't come back out of its sleep state except by a hard reset and restart. In my experience this sort of thing happens less with macOS and Linux-based systems, even though Microsoft's OS and cloud-based services are a breath of fresh air compared to Windows 95, XP, 2000, etc.