Steps to reproduce:
- Start a Node project that uses at least five direct dependencies.
- Leave it alone for three months.
- Come back and try to install it.
Something in the dependency tree will yell at you that it is deprecated or discontinued. That thing will not be one of your direct dependencies.
npm will tell you that you have at least one security vulnerability. At least one of those vulnerabilities will be impossible to trigger in your particular application, and at least one won't be fixable by updating your dependencies' versions.
(I am sure I exaggerate, but not by much!)
Why is it like this? How many hours per week does this running-to-stay-in-place cost the average Node project? How many hours per week of developer time is the minimum viable Node project actually supposed to have available?
Most of those are server-side languages and I’d disagree with the assessment. For web services Java needs some kind of server like Jetty, Undertow, or Tomcat. For testing you need JUnit or TestNG. And for common, everyday utilities you need something like Apache Commons or Guava. These things don’t “ship” with Java (and there are actually a fair number of runtimes now; it’s not only Oracle).
The thing that seems to benefit Java over Node is major corporate support (Oracle, Red Hat, IBM), so for better or worse you can usually rely on a handful of essential tools being updated regularly.
I wouldn’t say you need no dependencies in a Java project, but by all means check the average number of dependencies you get with Java or Python and compare it to almost any Node project.
You could probably sample projects on GitHub, look at the dependency graph, and compare.
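If you actually wanted to run that comparison, here's a rough sketch in Python. It assumes you've already cloned a sample of repos into a local samples/ directory (that path and the overall approach are just placeholders), and it only looks at a top-level package.json, requirements.txt, or pom.xml, ignoring lockfiles, Gradle builds, pyproject.toml, and so on:

```python
#!/usr/bin/env python3
"""Rough comparison of direct-dependency counts across locally cloned repos.

Sketch only: assumes a sample of projects cloned into ./samples/<repo>/ and
checks just the top-level manifest for each ecosystem.
"""
import json
import xml.etree.ElementTree as ET
from pathlib import Path
from statistics import mean


def count_direct_deps(repo: Path):
    """Return (ecosystem, direct-dependency count) or None if no manifest found."""
    pkg = repo / "package.json"
    if pkg.exists():
        data = json.loads(pkg.read_text())
        n = len(data.get("dependencies", {})) + len(data.get("devDependencies", {}))
        return ("node", n)

    reqs = repo / "requirements.txt"
    if reqs.exists():
        lines = [line.strip() for line in reqs.read_text().splitlines()]
        n = sum(1 for line in lines if line and not line.startswith("#"))
        return ("python", n)

    pom = repo / "pom.xml"
    if pom.exists():
        root = ET.parse(pom).getroot()
        # Count <dependency> elements regardless of the POM namespace; this is
        # deliberately rough (it also catches dependencyManagement entries).
        n = sum(1 for el in root.iter() if el.tag.endswith("dependency"))
        return ("java", n)

    return None


def main() -> None:
    counts = {}
    for repo in Path("samples").iterdir():  # hypothetical local sample directory
        if not repo.is_dir():
            continue
        result = count_direct_deps(repo)
        if result:
            eco, n = result
            counts.setdefault(eco, []).append(n)

    for eco, values in sorted(counts.items()):
        print(f"{eco}: {len(values)} repos, mean {mean(values):.1f} direct deps")


if __name__ == "__main__":
    main()
```

Direct counts alone will still understate the gap, though, since the churn described above mostly comes from the transitive graph rather than from what you declared yourself.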