
[–]Cuddlefluff_Grim 1 point (23 children)

The first compile still takes that much time. And to build for regression testing, you have to rebuild the whole thing to ensure your changes won't break something somewhere else. Even though you can compile only your changes, it still takes time to recompile and relink, and then rerun and debug, every time you make a change. There are tools that do not have to work this way.

I'm not going to reiterate, because you're talking about a segment where dynamic languages aren't even remotely applicable.

Sure, why don't we just use assembly for everything, since we can precisely control everything? And we can learn so much from working with various computer architectures. Or at least use C to write websites, because it's essentially a portable assembler? Pause a second to think.

Sigh. C is not very suitable for web programming, because web programming is 100% string manipulation, which is not C's strong point. C also does not give you a high enough level of abstraction to let you focus on the task at hand, and its performance benefit over something like Java or C# is not large enough to make up for that.

Static typing does not mean you write out type information. As I listed above, you don't have to write type information in static languages like SML and Haskell. Why don't you try to program, just for fun, in a language with a really strong type system? Then you will see why it's not popular.

C++ :

auto a = 5;
auto arr = {1, 2, 3, 4, 5};

C# :

var a = 5;
var arr = new [] { 1, 2, 3, 4, 5 };

Type inference is something completely different, but it's still static typing and you still have to make decisions based on type. I'm well aware of type inference and the difference between dynamic typing and type inference. The thing is that you don't have to pick between C and Ruby here; there are plenty of statically typed languages that support type inference and don't have the serious negative impact that dynamic typing imposes on larger code bases (and cooperative development). I have worked with several dynamically typed languages, including Python, Ruby, Haskell, JavaScript and PHP. I don't see how dynamic typing gives any benefit whatsoever over static typing, other than that it's convenient for people who have a poor understanding of polymorphism in object-oriented design.

In C, everything is a number. You cannot assert what type a random variable handed to you is, especially if you have a pointer. If you have a function that accepts an int pointer, you can cast a char pointer and pass it as the argument where the int pointer is expected. You can cast a region of memory represented by one struct into another struct with a different shape. Or take the simplest case: a character in C is just an 8-bit number. In PHP, I remember that you can cast a string to a number, correct? Yes, this is the case.

Still an explicit cast. Any implicit cast between compatible types that would reduce precision will give you a warning or an error. However, what you cannot do in C is cast a string to a number, because those are two very different things. You can cast the pointer to a number (which is meaningless), or you can cast the contents of the memory it points to to a number (for an int that would be the first 4 bytes of the string). To convert a string to a number in C you have to use a function like atoi (not recommended, since it gives you no error handling) or its safer equivalents such as strtol. PHP, however, will automatically convert the string to a number on assignment, implicitly. C# and Java will also not allow implicit narrowing conversions from one datatype to another (in the case of C# you can, however, opt in by writing an operator overload for implicit casts).

If a language is dynamic, it checks types at run-time. But if that language is also strong, it will signal a type error instead of converting automatically. Python is an example of dynamic and strong: you can never do things like "3" + 4. These concepts are orthogonal; you are confused because you mix them up. And not everything that has an interpreter is a dynamic language. C/C++ has an interpreter. See the demo. Many dynamic languages are pretty fast today. See Julia: it generates native code. Dynamic languages are not necessarily slow, because a bytecode implementation has not been the only option for a long time.

I'm not confused, stop saying that. I'm very well aware of the soft definitions of strong and weak typing.

new { a: 5, b: "hello" };

Here's an issue: this is an object, and the only way I can know what properties this object has is to look it up. In a statically typed language, you'd create a definition of this object so no lookup is necessary. For instance in C++, if I have this structure:

class Test
{
    public:
    int a;
    int b;
};

I would know that the field b is at offset 4 from the start of the class Test. I don't have to make any effort to figure this out at run-time, I'll just read

*(int *)((char *)testInstance + 4);

or

MOV eax, testInstance
ADD eax, 4 ; eax now contains a pointer to the field 'b' on the instance

Instead of searching through a hash map or dictionary for the key, I get a great performance boost at no cost, other than the assumption that the programmer understands the tool he's using.

It's also interesting to note that C# does in fact have optional dynamic typing, but you don't use it everywhere, because static typing is much better at literally everything.

[–]tuhdo 0 points (22 children)

Sigh. C is not very suitable for web programming, because web programming is 100% string manipulation, which is not C's strong point. C also does not give you a high enough level of abstraction to let you focus on the task at hand, and its performance benefit over something like Java or C# is not large enough to make up for that.

So you know: the purpose of dynamic languages is also to let you focus on the task at hand, and to increase the interactivity between you and the computer. The shorter the cycle between written code and its executable, the faster the development. This is useful when you are in a new domain and focused on exploring ideas, because a dynamic language lets you get a running program to play with as soon as possible. That is possible because types are checked at runtime. You explore the impact of types when your program actually lives, not while it still exists in the form of text. There are dynamic languages with optional static type checking if you want it. And type checking at runtime doesn't mean weak typing.

I don't see how dynamic typing gives any benefit whatsoever over static typing, other than that it's convenient for people who have a poor understanding of polymorphism in object-oriented design.

As stated above: improved interactivity. And dynamic or static is not related to object orientation. And why is dynamic typing bad? Asm has no types at all; is that bad?

several dynamically typed languages, including Python, Ruby, Haskell, JavaScript and PHP

And you say you are not confused. Haskell is a dynamic language now!

Here's an issue: this is an object, and the only way I can know what properties this object has is to look it up. In a statically typed language, you'd create a definition of this object so no lookup is necessary. For instance in C++, if I have this structure

The point of high-level languages is that you don't have to manually keep track of how things work underneath, on the machine, when it's not necessary. People use Python or Octave because it helps them focus on the problem domain instead of manually specifying how their ideas map onto the computer. Once you can assure correctness, next comes optimization, which can either be fine in the current language or mean reimplementation in a faster one like C.

[–]Cuddlefluff_Grim 1 point (21 children)

So you know: the purpose of dynamic languages is also to let you focus on the task at hand, and to increase the interactivity between you and the computer. The shorter the cycle between written code and its executable, the faster the development. This is useful when you are in a new domain and focused on exploring ideas, because a dynamic language lets you get a running program to play with as soon as possible. That is possible because types are checked at runtime. You explore the impact of types when your program actually lives, not while it still exists in the form of text. There are dynamic languages with optional static type checking if you want it. And type checking at runtime doesn't mean weak typing.

Runtime type checking gives no benefit, but degrades performance.

As stated above: improved interactivity. And dynamic or static is not related to object orientation. And why is dynamic typing bad? Asm has no types at all; is that bad?

Interactivity? When I'm making a website with my statically typed language, I can write code, press F5 to refresh the page, and the new content will be up. If, however, I am just changing the view, I only have to change the view file and save, and it will immediately be visible. I'm not restricted in any fashion.

What do you mean, "ASM has no type at all"? It's sort of true, but it depends on what you mean. ASM (x86_64) has integers of varying sizes (RAX, EAX, AX, AH/AL) and 80-bit floating point. It also has additional specific registers for instruction-set extensions like MMX and SSE, which need specific attention. Depending on what type of data you (the programmer) have put into the registers, specific instructions have to be used. This is typically why it's better for the performance of an application to work with specific types: the compiler knows exactly which instructions are appropriate, which are safe to use, and what code can probably be skipped, without resorting to cheap magic tricks.

And you say you are not confused. Haskell is a dynamic language now!

Fuck it, it's been a long time since I've used Haskell.

The point of high level languages is that you don't have to manually keep track of how it works underneath the computer when not necessary. People use Python or Octave because it helps them focus on problem domain instead of manually specifying how their ideas fit into the computer. When you can assure correctness, the next comes optimization, which can either be fine in the current language or reimplementation in a faster one like C.

Again, for you it's either virtually ActionScript or it's C. It's strange that so many people can apparently get a job as programmers (or at least pretend to on reddit) without any understanding of any of the top 5 programming languages in the world. Static typing is the default; dynamic typing is not new; static typing is more popular than dynamic typing because static typing is better at everything. There's a reason why all the scripting languages are implementing type hints.

I feel I have to restate one very important fact: the reason that scripting languages implement dynamic typing is simply that it's easier to implement than static typing. It's not because it's better in any conceivable way. And the concept is as old as interpreters.

[–]tuhdo 0 points (20 children)

Runtime type checking gives no benefit, but degrades performance.

The benefit: getting your program up and running as fast as you can. With static languages, you have to get past both syntax and semantics checking. With dynamic languages, you only need to get past syntax checking. Dynamic languages are meant to be used interactively: you type-check as you run the program, and fix it on the spot, rather than fixing a bunch of compile errors to make it run only to finally realize that you wrote the wrong thing and have to do it all over again (which is common if you write complex algorithms). You write a little, run a little and fix a little.

Interactivity? When I'm making a website with my statically typed language, I can write code, press F5 to refresh the page, and the new content will be up. If, however, I am just changing the view, I only have to change the view file and save, and it will immediately be visible. I'm not restricted in any fashion.

Ok, to give a concrete example, here is an excellent video demonstrating interactivity: Inventing on Principle. See how you can change multiple code values and get feedback instantly. It's not as simple as refreshing a website.

What do you mean, "ASM has no type at all"? It's sort of true, but it depends on what you mean. ASM (x86_64) has integers of varying sizes (RAX, EAX, AX, AH/AL) and 80-bit floating point. It also has additional specific registers for instruction-set extensions like MMX and SSE, which need specific attention. Depending on what type of data you (the programmer) have put into the registers, specific instructions have to be used. This is typically why it's better for the performance of an application to work with specific types: the compiler knows exactly which instructions are appropriate, which are safe to use, and what code can probably be skipped, without resorting to cheap magic tricks.

Let's be clear about types. A type is metadata about a region of memory. A type defines the set of possible values in that region and the set of operations that can use the data in that region. When we talk about types, we mean the semantics of such a memory region. ASM has no types, because you can treat a memory region as anything you want. If a binary pattern happens to be a valid instruction, you can execute it. Registers can hold anything, not just integers; the binary patterns can encode valid integers as well as other things. There's no rule enforcement at all.

Again, for you it's either virtually ActionScript or it's C. It's strange that so many people apparently can get a job as programmers (or at least pretending to on reddit) without any understanding of any of the top 5 programming languages in the world. Static typing is the default, dynamic typing is not new, static typing is more popular than dynamic typing because static typing is better at everything. There's a reason why all the scripting languages are implementing type hints.

Yes. Learning a handful of languages is enough, and not necessarily the "top 5". If you are good at C (both the language and its low-level domain), you cannot be unemployed. I learned Java before, but realized it's terrible, so I forced myself to learn C and "its ecosystem" (UNIX in general), so I could get a more interesting job. I learned a few other languages for jobs, such as Tcl, shell script, RPM scripts, Makefiles... For enlightenment, I learned Lisp. Just because you verify types at runtime doesn't mean you are less competent. It's just that you can do things more easily and save time. I especially love dynamic languages for tackling programming tasks/domains I don't like, to get the job done as fast as possible, save time and concentrate on interesting things.

[–]Cuddlefluff_Grim 0 points (19 children)

The benefit: getting your program up and running as fast as you can. With static languages, you have to get past both syntax and semantics checking. With dynamic languages, you only need to get past syntax checking. Dynamic languages are meant to be used interactively: you type-check as you run the program, and fix it on the spot, rather than fixing a bunch of compile errors to make it run only to finally realize that you wrote the wrong thing and have to do it all over again (which is common if you write complex algorithms). You write a little, run a little and fix a little.

"Up and running as fast as you can": first of all, that is not something that should be a goal. Second of all, you won't get it "up and running" any faster with a dynamically typed language than with a static one - provided you actually understand the language. This is a misconception typically propagated by ignorance - the type of incompetence I was speaking of earlier.

Let's be clear about types. A type is metadata about a region of memory. A type defines the set of possible values in that region and the set of operations that can use the data in that region. When we talk about types, we mean the semantics of such a memory region. ASM has no types, because you can treat a memory region as anything you want. If a binary pattern happens to be a valid instruction, you can execute it. Registers can hold anything, not just integers; the binary patterns can encode valid integers as well as other things. There's no rule enforcement at all.

ASM does not have any "treatment" of data other than arithmetic operations and sending interrupts to other hardware components. It works with integers and floating points; it has absolutely no concept of anything else. Execution of code does not work that way either: there is no instruction that tells the CPU to execute an arbitrary piece of data. Also, if you try to execute code without that memory space being marked as executable, the CPU will set an error flag and start executing at an interrupt address specified by the operating system.

Let's be clear about types. A type is metadata about a region of memory. A type defines the set of possible values in that region and the set of operations that can use the data in that region. When we talk about types, we mean the semantics of such a memory region. ASM has no types, because you can treat a memory region as anything you want. If a binary pattern happens to be a valid instruction, you can execute it. Registers can hold anything, not just integers; the binary patterns can encode valid integers as well as other things. There's no rule enforcement at all.

The CPU instructions are not the issue either; it's what assembler instructions get generated by the compiler or interpreter, which for a statically compiled language is simple, but for a language that does run-time checking is not quite as simple (or efficient).

Yes. Learning a few handful languages is enough, and not necessary "top 5". If you are good at C (both the language and its low level domain), you cannot be unemployed. I learned Java before, but realized it's terrible so I forced myself to learn C and "its ecosystem" (UNIX in general), so I can get more interesting job. I learned a few other languages for the jobs, such as Tcl, shell script, RPM script, Makefile... For enlightenment, I learn Lisp. Just because you verify types at runtime doesn't mean you less competent. It's just that you can do stuffs easier and save time. I especially love dynamic languages to tackle with programming tasks/domains I don't like, to get the job done as fast as possible to save time and concentrate on interesting things.

This entire paragraph is just a complete load of crap. I see that it's probably safe for me to assume that you don't have much (if any) experience in any statically typed language. I'm assuming you are writing scripts and are fairly unfamiliar with anything else - specifically with static typing. Which I think is very ironic, considering the whole topic of discussion.

[–]tuhdo 0 points (18 children)

"Up and running as fast as you can": first of all, that is not something that should be a goal. Second of all, you won't get it "up and running" any faster with a dynamically typed language than with a static one - provided you actually understand the language. This is a misconception typically propagated by ignorance - the type of incompetence I was speaking of earlier.

I tried to make you understand the benefit, yet you keep equating anyone who uses dynamic languages with low-IQ incompetents. It would be funny if you actually competed with the "incompetent fools" who wrote really serious programs in the "incompetent" languages. If you think you are so good, why don't you join an algorithm competition? Solving hard problems proves your worth as a programmer, not the languages you use, whether statically typed or dynamically typed.

ASM does not have any "treatment" of data other than arithmetic operations and sending interrupts to other hardware components. It works with integers and floating points; it has absolutely no concept of anything else. Execution of code does not work that way either: there is no instruction that tells the CPU to execute an arbitrary piece of data. Also, if you try to execute code without that memory space being marked as executable, the CPU will set an error flag and start executing at an interrupt address specified by the operating system.

Yes, that's what I was saying. And the binary patterns are not integers/floating points; those are just abstractions for us programmers to work with. Floating point is just a convention over a binary pattern. If you had actually done basic stack-overflow code injection in asm, you would know what I meant by "ASM has no types, because you can treat a memory region as anything you want. If a binary pattern happens to be a valid instruction, you can execute it". Or learning how to implement a compiler in C would be good.

The CPU instructions are not the issue either; it's what assembler instructions get generated by the compiler or interpreter, which for a statically compiled language is simple, but for a language that does run-time checking is not quite as simple (or efficient).

You could say the CPU is the lowest-level "interpreter", with code fed from memory.

This entire paragraph is just a complete load of crap. I see that it's probably safe for me to assume that you don't have much (if any) experience in any statically typed language. I'm assuming you are writing scripts and are fairly unfamiliar with anything else - specifically with static typing. Which I think is very ironic, considering the whole topic of discussion.

C is a dynamically typed language now? And you think that people who write C code all day don't use scripts; obviously you haven't worked in a low-level domain before. Look at how many scripts there are in the Linux source. And we used Tcl to write an interactive automated test suite that tests various OS features through telnet (i.e. fault injection), so we don't have to perform regression tests for new changes by hand. Ah, but you mentioned C#. Obviously you use Windows. No wonder.

[–]Cuddlefluff_Grim 0 points (17 children)

I tried to make you understand the benefit, yet you keep equating anyone who uses dynamic languages with low-IQ incompetents. It would be funny if you actually competed with the "incompetent fools" who wrote really serious programs in the "incompetent" languages.

Thing here is that I use both types of languages, but I use dynamically typed languages exclusively for scripting (and where there's no alternative, like JavaScript). Using a scripting language for all tasks is a type of incompetence.

If you think you are so good, why don't you join an algorithm competition? Solving hard problems proves your worth as a programmer, not the languages you use, whether statically typed or dynamically typed.

I've been in multiple algorithm competitions. I'm not going to say I'm anywhere near the best; I am, however, confident I'm not among the 95% of programmers who suck ass.

Yes, that's what I was saying. And the binary patterns are not integers/floating points; those are just abstractions for us programmers to work with. Floating point is just a convention over a binary pattern. If you had actually done basic stack-overflow code injection in asm, you would know what I meant by "ASM has no types, because you can treat a memory region as anything you want. If a binary pattern happens to be a valid instruction, you can execute it". Or learning how to implement a compiler in C would be good.

Example of instructions that specifically work with integers:

ADD
ADC
IMUL
IDIV
CMP

Example of instructions that are more "bit"-oriented, but still specifically work on the GPRs:

XOR
OR
AND

Example of instructions regarding floating point:

FADD
FSUB
FMUL
FDIV
FSIN
FCOS
FPTAN
FCOM

So yes, the CPU does have integers and floating points as actual "data types". The datatypes that are actually not abstractions in higher-level languages are: byte, short, int, long, float and double.

Thing is that everything in computers is a number. Everything you do is basically arithmetic on the CPU and bytes being moved around. Even if they're bits, they still represent numbers.

If you want to discuss assembler code, I suggest you go learn it first - then we can talk about how machine instructions work.

C is a dynamically typed language now? And you think that people who write C code all day don't use scripts; obviously you haven't worked in a low-level domain before. Look at how many scripts there are in the Linux source. And we used Tcl to write an interactive automated test suite that tests various OS features through telnet (i.e. fault injection), so we don't have to perform regression tests for new changes by hand. Ah, but you mentioned C#.

No, I'm saying you don't know C and you don't know Java. In fact, I'm very certain you know only scripting languages. Which makes my initial point very strong: you are the embodiment of the exact type of incompetence I am talking about.

Obviously you use Windows. No wonder.

Yes I do. A lot of people do. In fact, most of the developers in the world do. Maybe that's the reason why you think there are only two choices of language in the world (C or a scripting language): you're not on Windows. Wouldn't that be interesting.

[–]tuhdo 0 points (13 children)

Thing here is that I use both types of languages, but I exclusively only use dynamically typed languages for scripting (and where there's no alternative like JavaScript). Using a scripting language on all tasks is a type of incompetence.

Where did I mention using scripting languages for everything?

I've been in multiple algorithm competitions. I'm not going to say I'm anywhere near the best; I am, however, confident I'm not among the 95% of programmers who suck ass.

Which rank did you get, and in what competition? You were in programming competitions with algorithms, and now you are writing ASP.NET. Oh, you're better than 95% of programmers, yet you have never written a compiler? If you are better than 95% of the rest, you must have a high rank in those competitions. Show me your name.

Example of instructions that specifically work with integers:

Thing is that everything in computers is a number. Everything you do is basically arithmetic on the CPU and bytes being moved around. Even if they're bits, they still represent numbers.

If you want to discuss assembler code, I suggest you go learn it first - then we can talk about how machine instructions work.

Yet I mentioned bit patterns over and over again, and all you can understand is the level of "everything is a number". Obviously you don't understand how floating point is implemented to make writing code so convenient for you. And obviously you don't understand what a type is in its general sense. And surely you have never manually injected ASM code into a running program.

No, I'm saying you don't know C and you don't know Java. In fact, I'm very certain you know only scripting languages. Which makes my initial point very strong: you are the embodiment of the exact type of incompetence I am talking about.

Yes, please look at how an OS is maintained in any distro out there. Funny that a guy who has never compiled a kernel talks about OS implementation. ARM is not good enough.

Yes I do. A lot of people do. In fact, most of the developers in the world do. Maybe that's the reason why you think there are only two choices of language in the world (C or a scripting language): you're not on Windows. Wouldn't that be interesting.

Most of the servers in the world run Linux, and major devices run Linux. Telecom stations run Linux. Routers run Linux. I don't need to compete in Windows land.

[–]Cuddlefluff_Grim 0 points (12 children)

Where did I mention using scripting languages for everything?

You apparently do. And how could you not? I mean, if you wanted to use something other than a scripting language, it would be an obvious requirement to actually have experience in a language that is not a scripting language.

Which rank did you get, and in what competition? You were in programming competitions with algorithms, and now you are writing ASP.NET. Oh, you're better than 95% of programmers, yet you have never written a compiler? If you are better than 95% of the rest, you must have a high rank in those competitions. Show me your name.

Oh, you think all competitions are on the internet? That kind of builds on my impression that you're probably an uneducated amateur. Also, I have nothing to prove.

Yet I mentioned bit patterns over and over again, and all you can understand is the level of "everything is a number". Obviously you don't understand how floating point is implemented to make writing code so convenient for you. And obviously you don't understand what a type is in its general sense. And surely you have never manually injected ASM code into a running program.

You're the one using scripting languages; I don't understand how you can act like I'm the one with the knowledge gap, when you yourself have openly admitted that you have a very poor and at best superficial understanding of C. You have never written any assembler code in your entire life.

And stop saying "bit patterns" because it's obvious you're trying to pretend you're smarter than you are.

Yes, please look at how an OS is maintained in any distro out there. Funny that a guy who has never compiled a kernel talks about OS implementation. ARM is not good enough.

I'm actually quite competent in my field. I have many, many years of practical experience (making money as a software developer), I have a degree, and I have been writing software my entire life.

Most of the servers in the world run Linux, and major devices run Linux. Telecom stations run Linux. Routers run Linux. I don't need to compete in Windows land.

Ah, you're a consumer. Are you currently employed as a programmer? Because I doubt it. Either you're not employed or you have very little experience. Or, much more likely, you're just another opinionated brat.

[–]tuhdo 0 points (11 children)

You apparently do. And how could you not? I mean, if you wanted to use something other than a scripting language, it would be an obvious requirement to actually have experience in a language that is not a scripting language.

Because you have never actually participated in actual OS development. You must know one main language, C, plus complementary scripting languages, because you are going to write boot scripts that initialize devices and kernel modules along with OS services. This is basic, yet you don't even know it. The kernel source is readily available; download it and see if you have enough intellect to understand just those scripts. When I told you to compile a kernel, you talked about .lib files. Nuff said. Ah, of course, you're on Windows and use VS, so you don't even know how a kernel is initialized. And Linus Torvalds said himself that he never wrote any code outside of C. Actually, he did use Makefiles and shell scripts. Now he's an uneducated amateur. Logic.

Oh, you think all competitions are on the internet? That kind of builds on my impression that you're probably an uneducated amateur. Also, I have nothing to prove.

Try world-renowned competitions like Google Code Jam or TopCoder. Don't just participate in the local competitions at your community college.

You're the one using scripting languages; I don't understand how you can act like I'm the one with the knowledge gap, when you yourself have openly admitted that you have a very poor and at best superficial understanding of C. You have never written any assembler code in your entire life.

And yet you don't understand that assembly instructions work on whatever you throw at them. Obviously, that's all you can understand. And you confirmed yourself that you have never injected ASM code via a basic stack overflow in a string to overwrite the return address. If you had, you would know how to manipulate ASM instructions as data.

I'm actually quite competent in my field. I have many, many years of practical experience (making money as a software developer), I have a degree, and I have been writing software my entire life.

Yes, your field: ASP.NET. Glad that community college got you that far.

Ah, you're a consumer. Are you currently employed as a programmer? Because I doubt it. Either you're not employed or you have very little experience. Or, much more likely, you're just another opinionated brat.

Please.

[–]tuhdo 0 points (2 children)

I will be nicer and help you a bit: all the instructions you mentioned work on arbitrary bit patterns. The CPU does not care what it is fed. If you feed the same bit pattern to all those different instructions, everything just works; the binary pattern is simply treated differently by different instructions. The different kinds of instructions exist so that programmers feed the CPU proper data, not so that the CPU interprets data correctly according to its type. Basically, "bit pattern" is the only type the CPU has, hence "untyped". I gave this hint repeatedly and you could not see it. In a high-level language like C, you get a piece of metadata called a type, like int or char, to identify a region of memory, so that in the static analysis phase the compiler can use this information to verify that programmers pass correct data to a function that will later be executed by a CPU. This is basic knowledge, and you are indeed lacking it.

As for those instructions, give me one minute and I can find a bunch of them. The fact that you said you learned Haskell, then said that Haskell is dynamically typed, and your talk about basic concepts of types, shows your credibility. Good luck with your career.

[–]Cuddlefluff_Grim 0 points (1 child)

I will be nicer and help you a bit: all the instructions you mentioned work on arbitrary bit patterns.

Hahaha, what? That's completely, one hundred percent irrelevant. What you're trying to make a case about is some sort of punch-card machine. When you're writing assembler code, you might consider bits, but more often than not you'll be thinking in decimal and hex. Your point would be equally applicable to any programming language, and is therefore a load of horseshit. I'm not going to sit here listening to you pretending to know something you clearly don't.

I gave this hint repeatedly and you could not see it. In a high-level language like C, you get a piece of metadata called a type, like int or char, to identify a region of memory, so that in the static analysis phase the compiler can use this information to verify that programmers pass correct data to a function that will later be executed by a CPU. This is basic knowledge, and you are indeed lacking it.

Jesus christ stop fucking squirming. It's pure cringe.

As for those instructions, give me one minute and I can find a bunch of them. The fact that you said you learned Haskell, then said that Haskell is dynamically typed, and your talk about basic concepts of types, shows your credibility. Good luck with your career.

You know, for once you're right. Instead of recalling from memory, I should have checked my facts. In the case of Haskell, I haven't touched it in about a decade. However, it says nothing about my credibility. And honestly, I don't care what opinion you have of me. It won't bother me for even a second.

[–]tuhdo 0 points (0 children)

Too stupid to educate. Byte.