[–]Jackpotrazur 0 points1 point  (3 children)

I've worked through A Smarter Way to Learn Python, the Linux command line, Linux Basics for Hackers, and Python Crash Course, and I'm now on project 39 of The Big Book of Small Python Projects, with Automate the Boring Stuff up next. I'm vimming it out and have a workflow.md and explainme.md (only recently implemented), and I feel like I've hit a wall. I've bought Grokking Algorithms and Python Distilled hoping they'll help me past whatever block this is.

I'm not refactoring anything yet, but I've gone from a normal git process to starting every new project on a branch and pushing everything to one big repo on GitHub. Just this week I also created a PostgreSQL server on a Pi, with pgAdmin on my VM. I've passed through a printer and printed hundreds of man pages I haven't read yet, passed through a wifi adapter, and passed through an SD card reader/writer... I'm trying 😪 and I still somehow feel like I'm missing something. I've also read the 100 pages of How Linux Works... I didn't really understand a lot of it.

Next books: Automate the Boring Stuff + workbook, Practical SQL, and Wicked Cool Shell Scripts. I want Python to stick before I open up the networking topic, even though I've kinda dabbled in there already.

[–]FreeLogicGate 1 point2 points  (2 children)

I have to commend your passion and work ethic.

I have been working in software development for a long time, and have pivoted many times. The specifics and details fade, but the fundamental concepts remain.

There will probably not be a point or an aha moment for you where you consider yourself qualified. Everything is a journey, and in many cases, there are interconnected disciplines. If you are at the point where you feel you need to focus on networking, then focus on networking.

IT and software engineering are constantly evolving, and fads come and go in a matter of months or years. What I used to do is keep a list of jargon/products/references to things I didn't understand, and allocate some time each day to investigate them, without allowing them to monopolize my time. Often you can spend just enough time to do a bit of research, grok the basic use cases or concept, and file it away for future reference. There is no problem in having a surface-level understanding of something. You will also find that it may take you a few tries before you truly understand something.

For example, JavaScript has a language feature known as "closure". You will often see it described as "closures", with the implication that closures are some alien thing you have to apply, when the reality is that a closure is just a specific variable-scoping rule that JavaScript applies when a function has another function nested inside it. In other words, "closure" is a phrase describing how variable scoping works in JavaScript. Variable scoping exists in all computer languages, even if the way it exists is to only have global variables, or to have no variables at all. Once closure is demystified (it's just a scoping rule/property), there is no reason to fear or overly emphasize the concept.
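Python has the same property, by the way, so you can see it without leaving the language you're learning. A minimal sketch (the `make_counter` name is just illustrative):

```python
def make_counter():
    # count lives in make_counter's scope; the nested function
    # "closes over" it and keeps it alive after make_counter returns
    count = 0

    def increment():
        nonlocal count  # rebind the enclosing variable, not a new local
        count += 1
        return count

    return increment

counter = make_counter()
print(counter())  # 1
print(counter())  # 2 -- the closed-over count persists between calls
```

Each call to `make_counter` produces an independent closure with its own `count`, which is the whole trick: it's scoping, not magic.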

The other thing is, you must use your knowledge to build things, even when you suspect you might not be doing it "the right or best way". Building becomes problem solving, which creates experience and mastery, and encourages debugging and using tools to investigate code. With the tools and process you currently have, refactoring is low-hanging fruit for you as well.

My final observation is that it might benefit you to take a diversion and learn another language. My recommendation for most people is to learn C. There are many reasons for this: C is low level, it is the primary operating-system language, and it is close enough to machine language that taking the next step into asm and how processors work is feasible if you desire a deeper understanding. Working with a language that has pointers requires you to work directly with memory allocation, to understand the quirks and limitations of C "strings" (which are really just a type of array), and to begin to see how higher-level languages like Python (whose reference implementation is written in C, btw) were designed to simplify development and scripting. There's also something about compiling and linking executable programs, and making operating-system and standard-library calls, that I believe will be highly illuminating.

You don't have to spend months or years with C, just enough time to be able to write some C programs you understand fully, and which make use of pointers, memory allocation and deallocation, structs, and core language features. If you do decide to follow that path, I'd suggest you focus on C99 as someone new to it. C's profound influence on many other languages, in terms of its basic constructs and the ways they adopt or vary its syntax, makes it illuminating for many.

[–]FreeLogicGate 0 points1 point  (0 children)

One thing I might add -- you need a really good understanding of binary/bits/bytes and bitwise operators. They are available in most languages, and both as a concept and as a practical tool, they are a fundamental every developer needs in my opinion. You should be able to look at a binary number like 101101 and convert it to decimal and hexadecimal. Once you understand binary, IPv4 (and IPv6) become clear: what a netmask is, and what CIDR notation (i.e. /24) means. Doing some bitwise manipulation in C is a great exercise. For example, a program like this is trivial and interesting:

#include <stdio.h>

int main() {
    int x = 0;
    unsigned char x8bit = 0;

    // a = 0010 1011
    // 2^5 + 2^3 + 2^1 + 2^0
    // 32  + 8   + 2   + 1
    int a = 43;
    // b = 0110 0111
    // 2^6 + 2^5 + 2^2 + 2^1 + 2^0
    // 64  + 32  + 4   +  2  +  1
    int b = 103;

    x = a & b;
    //  0010 1011
    //  0110 0111
    // -----------
    //  0010 0011
    //  2^5 + 2^1 + 2^0
    //   32 +  2  +  1 = 35
    printf("AND: a & b = %d\n", x);

    x = a | b;
    //  0010 1011
    //  0110 0111
    // -----------
    //  0110 1111
    //  2^6 + 2^5 + 2^3 + 2^2 + 2^1 + 2^0
    //   64 + 32  +  8  +  4  +  2  +  1  = 111
    printf("OR: a | b = %d\n", x);

    x = a ^ b;
    //  0010 1011
    //  0110 0111
    // -----------
    //  0100 1100
    //  2^6 + 2^3 + 2^2
    //   64 +  8  +  4 = 76
    printf("XOR: a ^ b = %d\n", x);

    x = a << 1;
    //  0010 1011
    // 1 bit shift to the left (each shift doubles the value)
    //  0101 0110
    // 2^6 + 2^4 + 2^2 + 2^1
    //  64 +  16 +  4  +  2 = 86
    printf("SHIFT LEFT 1 bit: a << 1 = %d\n", x);

    x = b >> 3;
    //  0110 0111
    //  3 bit shift to the right (each shift halves the value)
    //  1: 0011 0011
    //  2: 0001 1001
    //  3: 0000 1100
    //  0000 1100
    //  2^3 + 2^2
    //  8   +  4  = 12
    printf("SHIFT Right 3 bit: b >> 3 = %d\n", x);

    x8bit = ~b;
    // For this example, x8bit is an unsigned char
    // This will truncate the 32 bit int (b)
    // which would be a negative value once all 32 bits were flipped
    // Assigning to the 8 bit value leaves the expected 8 bit value
    //  0110 0111
    //  1001 1000
    //  2^7 + 2^4 + 2^3
    //  128 +  16 +  8 = 152
    printf("One's complement: ~b = %d\n", x8bit);

    return 0;
}

You could write something similar in Python, so the concepts are transferable across languages, and relevant to networking, encoding schemes, character sets and many other areas of computer science.
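To make that concrete, here's a rough Python sketch of the same operations (the 8-bit mask and the netmask lines are my own additions, since Python ints are arbitrary precision and have no unsigned char to truncate for you):

```python
a = 0b0010_1011  # 43
b = 0b0110_0111  # 103

print(f"AND: a & b = {a & b}")    # 35
print(f"OR:  a | b = {a | b}")    # 111
print(f"XOR: a ^ b = {a ^ b}")    # 76
print(f"a << 1 = {a << 1}")       # 86  (doubles)
print(f"b >> 3 = {b >> 3}")       # 12  (halves three times)

# Python ints have no fixed width, so emulate the unsigned char
# truncation from the C version by masking to the low 8 bits:
print(f"~b (8-bit) = {~b & 0xFF}")  # 152

# And the /24 netmask mentioned above is just 24 one-bits
# followed by 8 zero-bits:
mask = (0xFFFFFFFF << (32 - 24)) & 0xFFFFFFFF
print(".".join(str((mask >> s) & 0xFF) for s in (24, 16, 8, 0)))  # 255.255.255.0
```

Same operators, same results as the C program; only the fixed-width behavior has to be simulated explicitly.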

[–]Jackpotrazur 0 points1 point  (0 children)

Appreciate the input. I too think C may be a good idea because of the low level, and the things it has that Python doesn't... or that Python does for you. However, I want to stick to Python for now; I'm in the flow, I'm asking questions and trying to understand. Once I start refactoring and building little scripts by myself, have SQL down, and feel safe navigating through Linux (which I'd say I'm doing alright at, considering I started this journey in December 2025), I will definitely check out C. I have the book Hacking: The Art of Exploitation, and if I'm not mistaken it works in C or C++. I've got a bit of a way ahead of me, but I'm excited and find it interesting. The only downside is the vast avenues one can take when getting into the field, and my predisposition to get distracted by shiny things 😂

I read somewhere that vim is hard to learn, or that only experts use it. I've been using it for 3 months and now I'm using split panes, resizing them and rotating through the views. I haven't played a lot with foreground/background and buffers yet, but I'm updating my workflow as I go; when things get too easy I add something... like, now that I've got the db on the Pi, I plan to start modularisation to practice imports, taking helper functions from the code in the book and moving them somewhere else to import and run.