Calm down, everybody.
Java might (might) be a valid solution for what Nadaka is asked to provide. With the obligatory caveat of Whateverthatisandifhetoldmehe'dhavetokillme, of course.
I've known deep-sea sensor platforms programmed in Forth, it being considered a better fit for the data inputs and microprocessor setup than anything bar intensively hand-coded assembler. Modern manned submersibles almost certainly run something else in any programmable systems.
I wouldn't use COBOL for rocket telemetry but for businesses... well,
I still wouldn't use COBOL[1], but I can see how numerate but not 'computerate' accountants could appreciate and audit the code, even those that can't write it.
I'm a fan of Perl (I do tend to mention that) because I can rapidly prototype things. The other day I did something with prime factorisation for which I was happy to have the freedom to create both arbitrary-length arrays and similarly expandable hashes, plus embed arrays within hashes and hashes within arrays for various categorisation and post-processing. But the overheads are a killer: once I'd tested it running for all the numbers from 1 to 10, from 1 to 100 and from 1 to 1,000, the jump to investigating all the numbers up to 1,000,000 gave me time to spare to do other things. If I'd done the code in something more compiled (e.g. one or other flavour of C), or possibly even gone the route of the part-compiled Java, I'd have been less quick on the coding and testing[2], but it would easily have run the First Million tester far faster, and I might even have been able to get a Thousand Million[3] run within the same time-scale. (Although I'd have already proven to myself what I'd wanted to prove.)
Ask me what I'd use for manned space-flight (especially if I was atop the rocket!), and I'd not use Java or COBOL or Forth, definitely not Perl; I'd avoid Pascal, stay away from all OO language implementations[4], yet would also be cautious of assembler-level programming (being too spoilt by the ability to use mid-to-high-level coding techniques). No, I don't know what I'd use. Probably not a time to dig out my old FORTRAN notes, either...
Ada has allegedly been used to program missiles, but the rust on my skill with it is currently a foot thick, even if the language itself is still considered notionally fit for purpose.
Tell you what,
don't ask me to program a manned space-flight. Those guys who already do it are probably very happy with their Assembler or whatever, and I'd rather not know if they use BASIC or ALGOL (or Lisp or bash scripts or just compile it straight from Z-notation) and just trust them. More than I'd trust me, if I were on top of the stack, regardless of whatever inflated opinion I might have of my own capabilities.
But I could rustle up that same Prime Number script in most of the languages I've mentioned (and others besides). Inefficiently, probably, but it'd work until it finished. Or hit the memory limits of the system. (As famously happened with my Rubik's Cube simulator for the BBC Micro... which couldn't have been a more early-'80s thing to have attempted unless I'd also been sitting on a space-hopper whilst listening to Spandau Ballet through a Sony Walkman.)
Erm... the original point was that Java's good for some things (mostly, dare I say, webapp-like implementations where cross-platform functionality is king), so don't bash it. It doesn't mesh particularly well with my current coding style and needs, of course.
[1] Godawful language with its ENVIRONMENT DIVISION stuff and compiler-mandatory indenting scheme, bearing in mind it's been around 20 years since I last typed any of it in...
[2] Partly because I flit around programming languages according to demand, and I definitely get 'skill-rust' when I've not actually used a language for a while. Usually I start trying to do something that's actually only possible in a language which is a cousin of the one I'm trying to use... Unfaithful me. Jack Of All Trades, Master Of None.
[3] Billion to some of the world, a Milliard to the others. For the record I personally prefer long-scale measures (finding the alternative akin to institutionalised exaggeration) but avoid the '-ard's in favour of "Thousand Million", etc. Or stick with 10^N notation.
[4] Although I acknowledge there's always good compile-time sanity checking for class compatibility, I don't trust the overheads. But that rules out Java, all the C++/C#/.NET etc. stuff, and (if not already ruled out) the likes of Delphi and similar OO developments of various older languages.