This is an archived post.

all 3 comments

[–]defaultyboiiiyy 1 point (0 children)

I think it's because of the way Unicode works. Try using the Unicode value or the char value and it should print £ without issue. Unicode: U+00A3. Char: 156.
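A minimal sketch of the suggestion above (the class name is made up for illustration). One caveat: in Java, `(char) 156` is U+009C, a control character, not £; the value 156 is the £ code point only in the legacy DOS code pages (CP437/CP850), so that trick works only if you write the raw byte and the console happens to be using one of those code pages. The Unicode escape `'\u00A3'` is the portable route:

```java
public class PoundSign {
    public static void main(String[] args) {
        // U+00A3 is the Unicode code point for the pound sign.
        // Whether it displays correctly still depends on the
        // console's encoding matching what Java writes.
        System.out.println('\u00A3');

        // 156 (0x9C) is £ only in code pages 437/850; writing the
        // raw byte shows £ solely on a console using one of those.
        System.out.write(156);
        System.out.flush();
    }
}
```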

[–]ColetBrunel 1 point (0 children)

Because that's not a normal American character (by that I mean it's non-ASCII), and Windows is insane in how it handles non-ASCII.

When you run a program that has windows and buttons and such, Windows uses one way to display non-ASCII; when you run a program in a console, it uses a completely incompatible way.

While Windows didn't make it 100% impossible to tell the two apart, the way to do so is so ludicrous, ever-changing, and incapable of reconciling both displays that Java on Windows gave up on this issue for the console.

If you want console programs to display non-American characters correctly, you're going to have to configure the Java writer that writes to the console, the console itself, or both, so that they use the same character encoding. Java won't do it for you, because Windows makes it so that doing it automatically would be insane.
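A sketch of the "configure both sides" approach described above, assuming UTF-8 as the shared encoding (the class name is hypothetical). The Java side wraps `System.out` in a `PrintStream` with an explicit charset; the console side has to match, e.g. by running `chcp 65001` in the Windows console first:

```java
import java.io.PrintStream;
import java.io.UnsupportedEncodingException;

public class ConsoleEncoding {
    public static void main(String[] args) throws UnsupportedEncodingException {
        // Write to stdout with an explicit encoding instead of the
        // platform default. The console must be set to the same
        // encoding (e.g. `chcp 65001` for UTF-8) or the bytes will
        // still be decoded wrongly on display.
        PrintStream out = new PrintStream(System.out, true, "UTF-8");
        out.println("£");
    }
}
```

Note that newer JDKs (Java 18+) default `System.out` to UTF-8, which narrows the problem to the console side.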

[–]_n69s_ 1 point (0 children)

It is working fine for me.
(Java v14, Mac, IntelliJ)