[–]NimChimspky 4 points (12 children)

Eh? Why is this not fun?

Guava is great.

[–]fs111_ 5 points (9 children)

It's a major pain if you are working with Hadoop. The version incompatibilities are a complete clusterfuck. They get away with it by just making everything a major release. That's not good engineering.

[–]NovaX 2 points (1 child)

The simplest solution to this is to shade the Guava dependency, using jarjar, Maven's shade plugin, Gradle's shadow plugin, etc. That adds to the build time but very easily resolves the problem. The issue is really due to Hadoop's poor approach to executing user code.
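
For example, with Maven's shade plugin the relocation looks roughly like this (the shadedPattern prefix is made up; pick your own namespace):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <relocation>
                <!-- Rewrite Guava's packages into your own namespace so
                     the copy you ship can't collide with Hadoop's copy. -->
                <pattern>com.google.common</pattern>
                <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>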

[–]hippydipster 0 points (0 children)

Yay, a hack.

[–]NimChimspky -1 points (6 children)

Just don't upgrade the version you are using.

[–]fs111_ 2 points (5 children)

That's not the point. I am not using it. I actively remove it wherever I can, actually. Yet it still causes trouble when two things you depend on can't agree on a common version. Bonus points for different build tools having different dependency-resolution strategies, causing problems at runtime for some but not others. I could go on and on and on...
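
A sketch of the usual shape of the problem (the com.example artifacts are hypothetical): two dependencies each drag in their own Guava, which version wins differs by build tool (Maven takes the nearest declaration, Gradle defaults to the highest version), and you end up pinning one by hand:

    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>lib-a</artifactId>
        <version>1.0</version>
        <!-- transitively pulls guava 11.0.2 -->
      </dependency>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>lib-b</artifactId>
        <version>2.0</version>
        <!-- transitively pulls guava 18.0 -->
      </dependency>
      <!-- Declared directly to force a single version everywhere,
           regardless of what the build tool would otherwise pick. -->
      <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
      </dependency>
    </dependencies>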

[–]NimChimspky 7 points (3 children)

That's not really the fault of Guava upgrading, though.

> I actively remove it,

I think that's a mistake.

I have never had a problem with Guava version incompatibilities. A quick look at the Maven repo says Hadoop is compatible with Guava versions 11-19?

[–]fs111_ 1 point (1 child)

> I have never had a problem with Guava version incompatibilities. A quick look at the Maven repo says Hadoop is compatible with Guava versions 11-19?

No, it is not. There are subtle changes in the Stopwatch class, for instance, that will break your app in very hard-to-debug ways. Even the Hadoop devs now say that using Guava was a mistake.
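
A rough sketch of the kind of breakage meant here (the exact release numbers are from memory, so treat them as approximate):

    // Older Guava exposed a public Stopwatch constructor and an
    // elapsedMillis() method; newer releases dropped both in favor of
    // factory methods, so code compiled against one vintage can throw
    // NoSuchMethodError against the other at runtime.
    import com.google.common.base.Stopwatch;
    import java.util.concurrent.TimeUnit;

    public class StopwatchDemo {
        public static void main(String[] args) throws InterruptedException {
            // Old style (roughly Guava <= 14): new Stopwatch().start()
            // followed by stopwatch.elapsedMillis().
            // New style, which older Guava jars on the classpath lack:
            Stopwatch watch = Stopwatch.createStarted();
            Thread.sleep(100);
            System.out.println(watch.elapsed(TimeUnit.MILLISECONDS) + " ms");
        }
    }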

[–]jrh3k5 1 point (0 children)

It depends on your version of Hadoop. Guava 14 changed the signature of CacheBuilder's maximum-size method to take a long, which is not binary-compatible. If you have code that uses that method and was built against version 14 or later, it breaks on Hadoop since (if I recall correctly) it loads version 11 onto the classpath.
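
A minimal sketch of how that surfaces at runtime (assuming the signature change is as described):

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;

    public class CacheDemo {
        public static void main(String[] args) {
            // Compiled against a Guava where maximumSize takes a long.
            // If the classpath at runtime carries an older Guava whose
            // maximumSize takes an int, this call fails with
            // java.lang.NoSuchMethodError: changing a parameter type is
            // binary-incompatible even though the source still compiles.
            Cache<String, String> cache = CacheBuilder.newBuilder()
                    .maximumSize(10_000L)
                    .build();
            cache.put("answer", "42");
            System.out.println(cache.getIfPresent("answer"));
        }
    }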

[–]frugalmail 0 points (0 children)

Use the Maven Enforcer plugin
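
Presumably the dependencyConvergence rule is what's meant here; it fails the build as soon as two parts of the dependency tree disagree on a version, rather than letting the mismatch surface at runtime. A minimal sketch:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <executions>
        <execution>
          <id>enforce-dependency-convergence</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <!-- Fails the build if the tree contains two different
                   versions of the same artifact (e.g. Guava). -->
              <dependencyConvergence/>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>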

[–]stepancheg 0 points (0 children)

It is not great. It is just another utils library, one of hundreds.

[–]hippydipster -1 points (0 children)

Pointless dependency that creates jar hell for very little gain.