I am old enough to remember the promise of 4GLs (Fourth Generation Languages), which got the "Reuse" movement going. It was a start, but the process was still largely manual and required immense amounts of hand-written code.

Today "Low Code" and even "No Code" environments have matured to the point where they should be the default option for anyone considering the development of an application.

However, the interesting thing is not the faster/cheaper/higher-quality route to a deployed application. What most people miss is what happens AFTER you deliver the application.

Maintenance, and the lack of any ability to rapidly reconfigure an application, is what costs companies the most money. Market forces, customer needs, new product/service ideas or any of the numerous other reasons to change an application or a process force companies to rewrite what the Java and C++ boys did - in new raw code. This means a massive job of re-analysing what was built before, then making changes to systems that are (almost always) undocumented.

Having the ability to just drag & drop, with automatically generated version control and full documentation, blows away the cost and delivery times of waterfall Java projects, not to mention the quality. Modern companies need to be agile - very agile. Being unable to amend your applications carries enormous opportunity costs, which most organisations neither quantify nor fully understand.

I know Java programmers have a living to earn, but the writing is on the wall, guys - in HUGE RED CAPITAL LETTERS!