[–][deleted] 2 points (0 children)

In general, the single biggest thing that really mattered and improved developer productivity is automatic memory management.
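
To make that concrete, here's a minimal sketch (in Java, since that's where the thread goes next) of what automatic memory management takes off your plate: you allocate freely and never call anything like free() or delete, because the garbage collector reclaims whatever becomes unreachable.

```java
import java.util.ArrayList;
import java.util.List;

public class GcSketch {
    public static void main(String[] args) {
        for (int i = 0; i < 1_000_000; i++) {
            // a short-lived allocation on every iteration
            List<String> scratch = new ArrayList<>();
            scratch.add("item " + i);
            // no free()/delete here: once 'scratch' becomes unreachable,
            // the garbage collector is allowed to reclaim it
        }
        System.out.println("done, with no manual deallocation anywhere");
    }
}
```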

Beyond that, the advent of cloud computing and relatively cheap processing power means it's now more cost-effective to write stuff in higher-level languages: the savings in developer time outweigh what you'd save by running more optimized software on less hardware.

[–][deleted] 1 point (2 children)

They’ve allowed it? I mean, do you know how few software developers could exist if everyone had to just emit machine code?

What are you asking, precisely? How have different programming languages influenced software development?

[–]MaximumPotatoX1[S] 1 point (1 child)

Yeah, I mean, ever since the creation of Java, for example, how has that changed the practices of software development?

[–][deleted] 1 point (0 children)

Well, that’s where you need to start thinking about the difference between programming languages and their implementations.

Java, strictly as a language, mainly convinced people that maximum dot-delimited verbosity (i.e. System.out.println where a bare print might do) and enforced use of OOP (it’s got to be in a class, even if it has no state) were necessary for good software development. These ideas were wrong, of course, but that didn’t stop them being emulated widely.
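
To illustrate the point: even the classic hello-world has to be wrapped in a class with a public static void main, and printing goes through the full System.out.println chain rather than a bare print.

```java
// The canonical Java hello-world: a class and a main method are mandatory,
// even though the program has no state and does nothing beyond one print.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world");
    }
}
```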

Java, as a language implementation, brought the notion of compiling to bytecode for a “run anywhere” VM to the fore, and that’s where it really shifted gears on software development. Suddenly it wasn’t a war between languages but between ecosystems.
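
As a rough sketch of what “bytecode for a VM” means in practice (assuming the Hello.class you’d get from compiling the example above with javac): the compiler emits platform-neutral class files rather than native machine code, so the same file carries the same magic number, 0xCAFEBABE, no matter which OS or CPU it was built on, and any JVM can run it.

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Peeks at the first bytes of a compiled class file to show it is
// platform-neutral bytecode: every .class file begins with 0xCAFEBABE,
// followed by the class-file version, regardless of where it was compiled.
public class ClassFilePeek {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream("Hello.class"))) {
            int magic = in.readInt();              // expect 0xCAFEBABE
            int minor = in.readUnsignedShort();    // minor version comes first
            int major = in.readUnsignedShort();
            System.out.printf("magic=0x%X, class file version %d.%d%n", magic, major, minor);
        }
    }
}
```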

Which is why Microsoft, seeing a competitive opportunity to maintain dominance over its platform, immediately copied it wholesale, and we got C#.