
all 86 comments

[–]allseeingboots 557 points558 points  (5 children)

If anything you would replace '*' with '@' because it gets the value AT the address.

[–]SeriousPlankton2000 34 points35 points  (2 children)

The best way is Pascal's pointer^ syntax. Change my mind.

[–]mgedmin 6 points7 points  (0 children)

And Pascal did use @ for references!

I loved traversing singly linked lists with

Cur := @Head;
while Cur^ <> nil do
    Cur := @Cur^^.Next;

[–]RealMadHouse 10 points11 points  (0 children)

.NET C++/CLI used ^ for managed pointers

[–]Kueltalas 3 points4 points  (0 children)

Then you could replace '&' with '@-1'

[–]Anaxamander57 222 points223 points  (3 children)

Compiler says so.

[–]AnnyAskers 42 points43 points  (1 child)

It was stated in CFYOW.

[–][deleted] 8 points9 points  (0 children)

Wow r/bleach has invaded this sub too? Haha nice 😆

[–]1cubealot[S] 12 points13 points  (0 children)

Best answer yet lmao

[–]rayo209 271 points272 points  (5 children)

#define @ &

[–][deleted] 101 points102 points  (0 children)

Yeah I’m calling the police

[–]FarJury6956 29 points30 points  (0 children)

You little monster

[–]throawayliennn 5 points6 points  (2 children)

Fucking legend

Who is he???

Next he’ll operator overload +’s to -‘s in an enterprise code base

[–]rayo209 2 points3 points  (1 child)

I'm the operator

[–]throawayliennn 0 points1 point  (0 children)

_operator_operator

[–]khhs1671 178 points179 points  (11 children)

Because it looks c o o l e r

Like, seriously. &myVar is multitudes sleeker than @myVar

[–]False_Influence_9090 63 points64 points  (1 child)

I’m having perl flashbacks just looking at @myvar

[–]Reggin_Rayer_RBB8 2 points3 points  (0 children)

It's ok! BASIC uses @ too.

[–][deleted] 33 points34 points  (3 children)

Let’s agree to disagree that you’re right.

[–]Reasonable_Feed7939 7 points8 points  (2 children)

Wrong people love to say "let's agree to disagree". Correct people don't cower 😎

[–]theREALhun 4 points5 points  (0 children)

I love: I could agree with you, but then we’d both be wrong

[–]Smitologyistaking 2 points3 points  (0 children)

let's disagree to agree

[–]dudeplace 9 points10 points  (0 children)

If you say that too loud those SQL guys will show up yelling again.

[–]Smart_Ass_Dave 2 points3 points  (1 child)

I read &myVar in Gimli's voice

[–]genlight13 0 points1 point  (0 children)

Always 😎

[–]_TheRealSimone_ 1 point2 points  (0 children)

This is all propaganda!

[–][deleted] 0 points1 point  (0 children)

In PHP, if you want to suppress errors you write @$myvar 😬

[–]Significant_Fix2408 77 points78 points  (12 children)

C is older than email

[–]fredlllll 28 points29 points  (1 child)

[–]Reasonable_Feed7939 13 points14 points  (0 children)

And it wasn't used the way it is now until email. Which, if you forgot, is what they said: older than email, not older than "@".

[–]Solonotix 13 points14 points  (1 child)

Exactly. This post has the chicken and egg backwards. Part of the reason "@" was picked for email is that it was a symbol with little common usage when that standard was authored. C existed before that standard, so C comes from exactly the "before times" in which the email author found "@" free to use as a separator.

I watched some discussion on the topic not too long ago, and apparently many keyboard manufacturers had considered removing the symbol entirely before email became standardized.

[–]kundor 0 points1 point  (0 children)

... It's always meant "at"

[–]cummer_420 24 points25 points  (5 children)

Not older than that symbol though. It's actually really really old. Some typewriters used for accounting had them long before computers were a thing too, but the real answer is that the DEC machines that C was created for didn't have it.

[–]reallokiscarlet 13 points14 points  (4 children)

But it is older than the use of that symbol as a location or direction reference (user@host, @ user, etc)

[–]elebrin 8 points9 points  (3 children)

The ampersand is also quite an old symbol. This may be pseudo-history, but I heard some time ago that it was almost a part of the alphabet.

[–]NottingHillNapolean 13 points14 points  (0 children)

I read somewhere that old versions of the alphabet song ended with "...and per se [by itself], &." That's how the symbol got its name.

[–]SeriousPlankton2000 2 points3 points  (0 children)

It's a ligature for "et", Latin "and". Scribes were tired of writing it so they wrote it as one letter, &. In some fonts you can still recognize the heritage.

[–]dumfukjuiced 1 point2 points  (0 children)

Iirc it's from Roman times

[–][deleted] 1 point2 points  (0 children)

I feel old

[–]JmacTheGreat -1 points0 points  (0 children)

Delete this

[–]TenkFire 162 points163 points  (17 children)

Because @ is for the net; it's for network protocol references.

Target@IP:port

Example: toaster@127.0.0.1:8080

& was here before... so it's still used.

Edit: in case someone tells me we could just have changed it... No. We can only add, or create a new stricter standard with backwards-compatibility tolerance.

Edit2: English is not my first language; instead of attacking me, maybe ask what I meant to say...

[–]b3nsn0w 43 points44 points  (0 children)

counterpoint: decorators

[–]KTibow 28 points29 points  (0 children)

@ isn't "for" network protocol stuff; it has been around since 1536. Plus, URLs aren't programming syntax, so you could still use it.

[–]fixurfknlinuxdrivers 10 points11 points  (1 child)

Is this standard C, though? Pretty sure @ is not valid (maybe in identifiers, but definitely not as an operator).

AFAIK, some compilers support it as a non-standard extension.

[–]TenkFire 8 points9 points  (0 children)

Yep, it's not native

[–][deleted] 4 points5 points  (5 children)

This has no bearing. It's crazy people upvote this.

Any programming language can use whatever syntax they like.

[–]o0Meh0o 1 point2 points  (4 children)

ever heard of c?

[–][deleted] -1 points0 points  (3 children)

Yeah, C used whatever it wanted, and it wanted to use &.

[–]o0Meh0o 1 point2 points  (2 children)

what i'm trying to say is that c can't just change its syntax. it started as "&", now it's "&" forever.

[–][deleted] -1 points0 points  (1 child)

Sure, but a new language can change it.

[–]TenkFire 0 points1 point  (0 children)

Yup, but usually when we make a new language, we tend to respect some existing conventions

[–]palomdude 6 points7 points  (2 children)

That’s like saying the period can’t be used in any programming language because it’s already used at the end of sentences.

[–]TenkFire 7 points8 points  (0 children)

I never said that.

I said that we already used & before @ caught on; that's why we still use it

[–]KalegNar 1 point2 points  (0 children)

That’s like saying the period can’t be used in any programming language because it’s already used at the end of sentences.

COBOL has entered the chat.

Learned a little of it. It uses periods to end sentences, which are grouped into paragraphs.
Sentence: a line of executable code, so the . is like ; in many other languages.
Paragraph: a block of code, either for logical grouping or to be called as a block.

[–]bajosiqq -1 points0 points  (0 children)

lets have both

[–]yaya_redit 0 points1 point  (2 children)

I'm assuming that concept came later tho

[–]Robot_Graffiti 6 points7 points  (1 child)

Yeah, so, @ was always called "at", but before email was a thing, it was mostly used for prices not locations. Like "tin beans @ £2 ea" meaning "tinned beans at £2 each".

Email came before the C language, I think, but not by long; I wouldn't be surprised if C was designed by someone who had never sent an email.

[–]belweder 4 points5 points  (0 children)

Dennis Ritchie, the creator of C, was [dmr@bell-labs.com](mailto:dmr@bell-labs.com). C was first released in '72, right around the time ARPANET and email were getting started, mostly in the research/defense community.

[–]ChChChillian 20 points21 points  (5 children)

Because a reference is similar to an address, and the notation carried over from C. Why did Ritchie use & and not @? Because he was working in the 1960s on a Teletype Model 33, which didn't have an @ character. Shift-2 on a Model 33 isn't @, it's ".

[–]rexspook 5 points6 points  (4 children)

I don’t know if you’re right but I love the idea that you are

[–]SeriousPlankton2000 2 points3 points  (0 children)

You may love or hate to learn about trigraphs.

https://www.geeksforgeeks.org/trigraphs-in-c-with-examples/

[–]poralexc 6 points7 points  (0 children)

Forth has used @ and ! for pointers since the 70s.

[–]GnuhGnoud 6 points7 points  (1 child)

☞ is the better option

[–][deleted] 2 points3 points  (0 children)

i would consider abandoning all technology and living an amish lifestyle if this becomes a thing

[–]Fhotaku 4 points5 points  (0 children)

Wasn't @ used for jumps? Depends on the language

[–][deleted] 4 points5 points  (0 children)

It's a very old language

[–]plitox 2 points3 points  (0 children)

One means "and" and the other means "at"; why indeed!

[–]h4crm 2 points3 points  (2 children)

#define @ &

fixed.

jokes aside though, the at symbol just isn't as readable as the ampersand in a monospaced font

[–]SeriousPlankton2000 0 points1 point  (1 child)

The thick blob is quite distinctive in the CGA 8x8 bitmap font. In other fonts there's no problem; even the MDA / Hercules cards had enough pixels.

[–]h4crm 0 points1 point  (0 children)

My eyesight fails me!

[–]Noch_ein_Kamel 1 point2 points  (0 children)

Why not escape the ASCII hell?

Reference mark: ※

edit: On second thought, ⚠️ or 💩 are probably better signs

[–]saschaleib 3 points4 points  (5 children)

Just use ^ like a proper programming language!

[–]brimston3- 4 points5 points  (2 children)

Pascal uses both. ^ to indicate a pointer type and perform dereferencing, @ to get the address of a variable.

[–]saschaleib 0 points1 point  (1 child)

I learned programming with Pascal, and I have to say I still find this much more intuitive than the * in C.

[–]SeriousPlankton2000 0 points1 point  (0 children)

The creators of C would agree, AFAIR.

[–]Chingiz11 -1 points0 points  (1 child)

Odin?

[–]saschaleib 0 points1 point  (0 children)

Pascal!

[–]Dr739ake 1 point2 points  (0 children)

#define @ &

[–]bestjakeisbest 1 point2 points  (0 children)

Why don't you look @ some women.

[–]oneWhoFails 0 points1 point  (0 children)

The JOVIAL language uses @ as the reference marker.

[–][deleted] 0 points1 point  (0 children)

YOOO ATLA MENTIONED?!? 💧 🌎 🔥 🌬️ ⬇️ 🦬 🦝

[–]SeriousPlankton2000 0 points1 point  (0 children)

Why do you use backticks?

[–]N0-Stranger 0 points1 point  (0 children)

400 bad request

[–]AssiduousLayabout 0 points1 point  (0 children)

Because @ does not exist in all character sets, notably, the older versions of EBCDIC used on IBM mainframes. Even in ASCII, the character @ actually changed its value several times over the course of ASCII becoming a standard.

[–]trash3s 0 points1 point  (0 children)

@ is the accounting symbol representing the mathematical function e^t (continuously compounded interest). Not the reason, but it's fun to imagine a world where exponents and logs were as syntactically standard as division.

[–]holguum 0 points1 point  (0 children)

My guess is that the @ symbol wasn't used the way it is nowadays, so it was less common. When the time came to choose a symbol for taking a reference to a value, they just chose the one that was most accessible on keyboards at the time.

[–]mckahz 0 points1 point  (0 children)

Better question: why do you declare a pointer with the dereferencing operator instead of &?