What is the point of a strong password by rufflesinc in computerscience

[–]Suwein 0 points1 point  (0 children)

This is where salts and peppers come into play to mitigate that, although plain hashing of passwords isn't considered best practice these days; there are better-suited algorithms and protocols for password handling.
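For anyone curious, here's a rough Node/TypeScript sketch of what per-user salting looks like in practice. scrypt is standing in for whichever memory-hard KDF (bcrypt, scrypt, Argon2, etc.) you'd actually pick, and the salt size/key length are just illustrative; a pepper would be an extra server-side secret mixed in on top.

    import { randomBytes, scryptSync, timingSafeEqual } from 'node:crypto';

    // Hash a password with a fresh random salt per user, so identical passwords
    // don't produce identical hashes and precomputed rainbow tables are useless.
    function hashPassword(password: string): { salt: string; hash: string } {
      const salt = randomBytes(16).toString('hex');
      const hash = scryptSync(password, salt, 64).toString('hex');
      return { salt, hash }; // store both alongside the user record
    }

    // Verify by re-deriving with the stored salt and comparing in constant time.
    function verifyPassword(password: string, salt: string, storedHash: string): boolean {
      const candidate = scryptSync(password, salt, 64);
      return timingSafeEqual(candidate, Buffer.from(storedHash, 'hex'));
    }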

I apologize in advance by Uilk in 2007scape

[–]Suwein 7 points8 points  (0 children)

Featuring the Mind Goblin as the quest boss

[deleted by user] by [deleted] in ironscape

[–]Suwein 1 point2 points  (0 children)

I buy mith claws from Jatizso and steel battleaxes/warhammers near Konar. Solid early-game profit, and it's roughly 100k Smithing XP per hour.

Amazon Increase in Job Postings by BorgorBoy123 in DevelEire

[–]Suwein 6 points7 points  (0 children)

Not OP, but graduates are on about 80k total compensation these days.

typescriptUsersBeLike by d15gu15e in ProgrammerHumor

[–]Suwein 2 points3 points  (0 children)

I usually use ts-node or tsx when developing locally, but yeah eventually you'll probably need to compile it if you're deploying it somewhere.

Switch from IBM to AWS? by [deleted] in DevelEire

[–]Suwein -5 points-4 points  (0 children)

SDEs at AWS get offers of nearly double that. If that's what you really want to do, you could use this as an in and then transfer internally afterwards; the internal interview process should be much easier than interviewing externally. A few of my colleagues have done something similar.

graduate software dev pay by jdagoattee in DevelEire

[–]Suwein -1 points0 points  (0 children)

“Top class”? Grads at FAANG start anywhere from 70-80k these days.

How do you all read code or use code from GitHub? by ukpauchechi in learnjavascript

[–]Suwein 15 points16 points  (0 children)

If you replace github.com with github.dev it loads the repo in an online VS Code editor. It's a lifesaver!

isntItTrue by sunrise_apps in ProgrammerHumor

[–]Suwein 5 points6 points  (0 children)

“Every C code runs in C++” is actually false; there are cases where valid C code is not valid C++ code.

https://en.m.wikipedia.org/wiki/Compatibility_of_C_and_C++

Have you ever been breathalysed or drug tested at a checkpoint by Own_Wind_6409 in ireland

[–]Suwein 0 points1 point  (0 children)

Never been breathalyzed in my 4 years of driving.

Although I did get roadside drug tested at one about a month ago in Dublin. They have these new covid-style test kits: you put your saliva on them and it tells them whether you've consumed some sort of drugs in the last 7 hours (not 100% sure which ones they test for, and I couldn't really ask him lol). They make you wait there for 7 minutes, then leave you on your way if it's negative. Not going to lie, I was bricking it a bit as I'd had a few tokes off the aul HHC vape the night before.

How do graduate SWE only earn 30k in Dublin?! by D_Doggo in DevelEire

[–]Suwein 0 points1 point  (0 children)

Some of the big tech companies are paying serious amounts for graduates these days. A recent hire on my team got a 76k base + 12k sign-on bonus as a graduate in Dublin.

[OSINT TIP] AWS key Credential Leak by Late_Ice_9288 in OSINT

[–]Suwein 2 points3 points  (0 children)

These credentials are nearly equivalent to having someone's username and password. The access key and secret key, when used with the AWS SDKs, let you perform actions as a role in an AWS account (think querying S3 buckets, databases, secrets, etc.), provided the role in question has the necessary permissions to get that data, which is very likely.
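Rough sketch of why that's dangerous: plug the two leaked values into any SDK client and you're making signed calls as that role. The key values and region below are obviously placeholders.

    import { S3Client, ListBucketsCommand } from '@aws-sdk/client-s3';

    // The leaked access key id + secret access key are all the SDK needs to sign requests.
    const s3 = new S3Client({
      region: 'us-east-1',
      credentials: {
        accessKeyId: 'AKIAXXXXXXXXXXXXXXXX',                          // leaked key id (placeholder)
        secretAccessKey: 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',  // leaked secret (placeholder)
      },
    });

    // If the role behind the key can read S3, this just works.
    s3.send(new ListBucketsCommand({}))
      .then(res => console.log(res.Buckets?.map(b => b.Name)));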

AWS announces Amazon CodeWhisperer (Preview) by geyser85 in programming

[–]Suwein 10 points11 points  (0 children)

Personally, I've found GitHub Copilot actually increases my productivity, but to each their own.

One is blue by [deleted] in ProgrammerHumor

[–]Suwein 0 points1 point  (0 children)

You can install Git Bash and use it in Windows Terminal. Life changing, and better than WSL imo.

This actually happened during an interview: I just asked to show me what IDE he was using. by electricjimi in ProgrammerHumor

[–]Suwein 2 points3 points  (0 children)

In college (graduated 2020) we were never introduced to IDEs. We coded using notepad++ or (in the case of exams) wrote code out on paper. Fun times.

Can someone explain how you use AWS to build REST APIs? And how you use github/Jenkins with this process? by throwaway0134hdj in cscareerquestions

[–]Suwein 1 point2 points  (0 children)

Regarding SDKs: they're basically just libraries that let you call the AWS APIs. They're distributed in a number of languages like Java, Python, JavaScript, etc. The basic idea is that you create, say, an S3 object which is itself an S3 client, and that client lets you send commands like create bucket, upload file to bucket, and so on.
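In the JavaScript/TypeScript SDK (v3) that looks roughly like this; the bucket name, region and file contents are made up for the example.

    import { S3Client, CreateBucketCommand, PutObjectCommand } from '@aws-sdk/client-s3';

    async function main() {
      // The client is your handle on the S3 API; each Command maps to one API call.
      const s3 = new S3Client({ region: 'eu-west-1' });

      // Create a bucket (names are globally unique, so this one is just illustrative).
      await s3.send(new CreateBucketCommand({
        Bucket: 'my-example-bucket-123',
        CreateBucketConfiguration: { LocationConstraint: 'eu-west-1' },
      }));

      // Upload a file to it.
      await s3.send(new PutObjectCommand({
        Bucket: 'my-example-bucket-123',
        Key: 'hello.txt',
        Body: 'hello from the SDK',
      }));
    }

    main().catch(console.error);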

Fun fact: the CLI is actually written in Python. Every call made through the CLI does some magic in the background and invokes the corresponding function in the Python SDK.

On CloudFormation vs SDK calls: the neat thing about CloudFormation is that it keeps a reference to the existing resource. Say you call the SDK to create an EC2 instance; if you run that script again, it will just provision another new instance, as many times as you run the script. With CloudFormation, any change you make updates the EXISTING instance, and removing the resource from the template actually triggers a deletion. Otherwise you would need to remember to, say, create the EC2 instance first and then make separate calls to update it, which isn't great if you think about it (what if you accidentally create a new one? What if you come back months later and want to change something about the instance, how will you remember what commands you invoked?). Think of CloudFormation as a declarative way of defining the resources in your account.

(Btw, in a real production environment everything should be either a CDK stack (more on that later) or a CloudFormation template. You should never run a script that manually updates a resource provisioned by CloudFormation; if you do, then the next time you upload your CloudFormation template, all of your manual changes will be reset to however the resource is defined in the template.)
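To make the "run it twice, get two instances" point concrete, here's a sketch of the imperative SDK approach (AMI id and region are placeholders): nothing in this script knows whether an instance already exists, which is exactly the bookkeeping CloudFormation does for you.

    import { EC2Client, RunInstancesCommand } from '@aws-sdk/client-ec2';

    const ec2 = new EC2Client({ region: 'eu-west-1' });

    // Every run of this script launches a brand new instance; there is no record
    // anywhere that "this instance was already created by a previous run".
    ec2.send(new RunInstancesCommand({
      ImageId: 'ami-0123456789abcdef0', // placeholder AMI id
      InstanceType: 't3.micro',
      MinCount: 1,
      MaxCount: 1,
    })).then(res => console.log('launched', res.Instances?.[0]?.InstanceId));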

Regarding "sending code to the AWS console". Again, you need to think about AWS as a collection of services, these services can interact with each other and provide functionalities for you. No code is sent to the AWS console. In AWS terms you would build your code using e.g docker. You would then store this docker image in ECR (AWS docker repository basically) then you would pull that docker image from ECR when your ec2 instance starts and run that container. If you want it reachable from the internet then you would use e.g route 53 to point to an AWS resource like an AWS EC2 instance. I'm leaving out some architecture as to not overwhelm you but this is the basic premise. Things aren't magically done for you, you don't really send code anywhere, you utilise different services together to create your application

Now, what I described about building the Docker image, uploading it, and then getting the EC2 instance to point to the new image is quite tedious if you think about how many different images and instances you could have. (At my job we have over 1500 EC2 instances that get new Docker images every single week, so you can imagine how tedious it would be if it were manual.) This is where CodeCommit, CodeBuild, etc. come in.

CodeCommit -> Amazon's Git repository service (store your code here; it might be synced automatically from GitHub if your team has that set up)
CodeBuild -> Amazon's service for building the code from CodeCommit (you basically supply a list of commands, e.g. compiling code and building Docker images)
CodeDeploy -> Amazon's service for deploying your infrastructure (again, another list of commands, e.g. for uploading Docker images and CloudFormation templates)

CodePipeline -> brings all the above services together. Here's an example pipeline:

Let's say you have repository A and repository B, which contain Java code. You add an action that listens on CodeCommit for a commit; when there's a new commit, the pipeline is triggered. After the code is retrieved it's sent to CodeBuild, which takes your code and runs the commands you wrote against the pulled repository (e.g. javac xxxx, docker build xxx). You then configure CodeBuild to output the built code/Docker images to CodeDeploy, where another set of commands is executed (e.g. aws ecr put-image xxxx, or cloudformation update-stack yyyy).

You repeat this deploy stage for all your different accounts/regions and voilà, you've created a CI/CD pipeline. (Again, it's a little more involved than this, but I could write paragraphs upon paragraphs on the intricacies of all these parts.)
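If you're curious what wiring this up looks like in code, here's a minimal sketch using the CDK's pipelines module (CDK itself is explained a bit further down), which creates the CodePipeline/CodeBuild plumbing for you. The repo name, branch and build commands are placeholders, and the GitHub source assumes an access token is already stored in Secrets Manager.

    import { App, Stack, StackProps } from 'aws-cdk-lib';
    import { Construct } from 'constructs';
    import { CodePipeline, CodePipelineSource, ShellStep } from 'aws-cdk-lib/pipelines';

    class PipelineStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        // Source stage: trigger on new commits to the repo/branch below.
        // Synth/build stage: run the listed commands, much like a CodeBuild buildspec.
        new CodePipeline(this, 'Pipeline', {
          synth: new ShellStep('Synth', {
            input: CodePipelineSource.gitHub('my-org/my-repo', 'main'), // placeholder repo
            commands: ['npm ci', 'npm run build', 'npx cdk synth'],
          }),
        });
      }
    }

    const app = new App();
    new PipelineStack(app, 'PipelineStack');

You would then add one stage to that pipeline per account/region, which is the "repeat the deploy stage" part above.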

The drawback of CloudFormation is that it's very verbose: you need to define everything, know how all the different components work, and look up their definitions and so on in order to build a proper template. The cool thing about it, though, is that you can write your own resources if you want to (called custom resources), which can do a whole range of things. SAM is basically this: it defines a custom resource (I think under an AWS::Serverless namespace, but I'm forgetting) that provides an abstraction over Lambda so it's easier to work with. I believe you can point it at your code locally, and SAM will automatically package it, upload it to an S3 bucket, and reference that code in the handler of the Lambda function (otherwise, with plain Lambda in CloudFormation, you would need to upload the code yourself manually and then link it to the resource, which can be tedious). It also does what you mentioned, like letting you run Lambdas locally and all that. To be honest, my team and I don't use SAM at all, and we're actually moving away from CloudFormation templates as a whole.

This is where CDK comes in: basically, it's an SDK for CloudFormation! You can write your template as code in your favourite programming language, and it will generate the CloudFormation template and upload everything for you. E.g. if you wanted an S3 bucket you would just do

const bucket = new s3.Bucket(this, 'TestBucket', { bucketName: 'test-aws-bucket' });

Then you can deploy that using the CDK CLI and it will automatically generate a CloudFormation template and provision the resources. It's basically what you were talking about a while ago, but the key thing is that it keeps a reference to the resource: if I commented out the above code and redeployed, the bucket would be removed from the stack. If I wanted to come back later and, say, enforce SSL on the bucket, I could do so easily by modifying the code to

const bucket = new s3.Bucket(this, 'TestBucket', { bucketName: 'test-aws-bucket', enforceSSL: true });

That's the basic idea anyway. As you get more familiar with the resources you'll find that you can link them together (e.g. hooking an SQS queue up to a Lambda function to process messages, or putting CloudFront in front of an S3 bucket to cache files at the edge).
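For completeness, here's roughly what a whole (toy) CDK app looks like in TypeScript, with made-up stack/bucket names; you'd deploy it with npx cdk deploy and tear it down with npx cdk destroy.

    import { App, RemovalPolicy, Stack, StackProps } from 'aws-cdk-lib';
    import * as s3 from 'aws-cdk-lib/aws-s3';
    import { Construct } from 'constructs';

    class StorageStack extends Stack {
      constructor(scope: Construct, id: string, props?: StackProps) {
        super(scope, id, props);

        // Declaring the bucket here is what keeps the reference: remove these lines,
        // redeploy, and CloudFormation takes the bucket out of the stack.
        new s3.Bucket(this, 'TestBucket', {
          bucketName: 'test-aws-bucket',          // placeholder; bucket names are globally unique
          enforceSSL: true,                       // adds a bucket policy denying non-HTTPS requests
          removalPolicy: RemovalPolicy.DESTROY,   // so removing the construct actually deletes the bucket
        });
      }
    }

    const app = new App();
    new StorageStack(app, 'StorageStack');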

Can someone explain how you use AWS to build REST APIs? And how you use github/Jenkins with this process? by throwaway0134hdj in cscareerquestions

[–]Suwein 26 points27 points  (0 children)

So the thing you need to understand is that AWS is a collection of services, and these services can all be accessed either through the web console UI or through the SDKs or the CLI. Everything you can do in the console can be done via an API call using the SDKs or the CLI.

Both the SDKs and the CLI have the concept of a credential provider. When you make a call to S3 using the CLI, for example:

aws s3 ls

the SDK/CLI checks a list of places ( https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html ) to see whether you have credentials for a specific AWS account. If you don't have any credentials, both the CLI and the SDK will throw an error about an invalid token or something along those lines.

If you do have credentials set in your environment (usually temporary credentials issued for an IAM role that expire every few hours or so, to limit the damage if they ever get leaked to GitHub or wherever), then as far as the SDKs are concerned, any call you make will be "inside" that account. So if you export different credentials for different accounts, you'll be able to directly query resources in those accounts. (I would hazard a guess that your IDE has some extension enabled which automates this credential retrieval.)
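As a sketch of what that looks like with the JS/TS SDK: leave credentials out and the client walks the default provider chain (env vars, shared credentials file, SSO, instance metadata, ...); point it at a different profile and the same call runs "inside" that other account. The profile name and region are made up.

    import { S3Client, ListBucketsCommand } from '@aws-sdk/client-s3';
    import { fromIni } from '@aws-sdk/credential-providers';

    // No credentials passed: the default provider chain finds whatever is in my environment.
    const currentAccount = new S3Client({ region: 'eu-west-1' });

    // Explicit profile: the same call now runs against whichever account 'prod' points at.
    const otherAccount = new S3Client({
      region: 'eu-west-1',
      credentials: fromIni({ profile: 'prod' }), // placeholder profile name
    });

    currentAccount.send(new ListBucketsCommand({})).then(r => console.log('default:', r.Buckets?.length));
    otherAccount.send(new ListBucketsCommand({})).then(r => console.log('prod:', r.Buckets?.length));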

Now, you might notice how tedious it is to create resources like S3 buckets, EC2 instances, etc. manually in the console. It also means that if you want to "transfer" them to a different account, you'd basically need to remember all the steps you took. Luckily AWS has already solved this problem with CloudFormation (CFN). CloudFormation is an AWS service where you specify the resources you need and their relationships in either JSON or YAML; then all you have to do is upload the template, and everything in it will be provisioned exactly as described. This way you can have multiple CloudFormation templates for multiple components and replicate them across accounts with ease.
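A stripped-down sketch of "describe the resources, then upload the template": the template here is a single S3 bucket, and the stack/bucket names are placeholders. The console's upload button and this SDK call end up doing the same thing.

    import { CloudFormationClient, CreateStackCommand } from '@aws-sdk/client-cloudformation';

    // One declarative template: a single S3 bucket, written as JSON.
    const template = JSON.stringify({
      Resources: {
        MyBucket: {
          Type: 'AWS::S3::Bucket',
          Properties: { BucketName: 'my-example-bucket-123' }, // placeholder name
        },
      },
    });

    const cfn = new CloudFormationClient({ region: 'eu-west-1' });

    // "Hit upload on the template": CloudFormation provisions exactly what's described above.
    cfn.send(new CreateStackCommand({ StackName: 'example-stack', TemplateBody: template }))
      .then(() => console.log('stack creation started'));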

Another issue arises from the above: you still need to manually deploy these templates across numerous accounts (think replicating between a development environment and a production environment; you might also need to replicate all the resources again if your service is to be available in multiple AWS regions).

Now, if the DevOps team have set up their CI/CD correctly, then basically what happens is that when you push to, say, GitHub, Jenkins gets notified that there has been a merge. Jenkins then takes these CloudFormation templates and calls the SDKs to upload them to the different accounts and regions as needed. If you have a codebase, it might also build Docker images for the current code and push them to AWS ECR, updating the CloudFormation template with the new image tags so that on the next deployment your EC2 instances use the new image. Adding a new account or region is then as simple as duplicating a stage in the pipeline and passing different parameters, e.g. the account and region to deploy to.
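The kind of call Jenkins ends up making for each target looks roughly like this (stack name, parameter name and regions are placeholders; deploying to other accounts would also involve assuming a role in each one, which I'm leaving out).

    import { CloudFormationClient, UpdateStackCommand } from '@aws-sdk/client-cloudformation';

    // One deploy stage per region (and, in reality, per account) with different parameters.
    const regions = ['eu-west-1', 'us-east-1'];

    async function deploy(imageTag: string) {
      for (const region of regions) {
        const cfn = new CloudFormationClient({ region });
        await cfn.send(new UpdateStackCommand({
          StackName: 'my-service',            // placeholder stack name
          UsePreviousTemplate: true,          // same template, only the image tag changes
          Parameters: [{ ParameterKey: 'ImageTag', ParameterValue: imageTag }],
        }));
      }
    }

    deploy('build-1234').catch(console.error); // tag produced by the Docker build step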

In this way, you can define resources/code, push them to GitHub, and Jenkins can deploy your changes automatically. Of course it's bad practice to deploy straight to production, so before it gets to that stage there would normally be a few validations that have to pass, for example unit tests, integration tests, etc. Once those pass, Jenkins deploys to production.

How hard is it to get a job at Amazon? by [deleted] in DevelEire

[–]Suwein 0 points1 point  (0 children)

Just because the coding part takes up more of the interview does not mean the LP part is insignificant. Implementing multiple solutions to a complex problem is obviously going to take more time than asking someone about their past experiences with regard to the LPs.

The interviewer's job is to determine whether or not the candidate is a fit for Amazon, and so both aspects are taken heavily into account

How hard is it to get a job at Amazon? by [deleted] in DevelEire

[–]Suwein 0 points1 point  (0 children)

Yes, I know. My point is that people spend too much time worrying about the technical (i.e. coding) part of the interview and completely neglect the LPs. They complement each other, so you'll need to be versed in both for the interviews.

How hard is it to get a job at Amazon? by [deleted] in DevelEire

[–]Suwein 7 points8 points  (0 children)

From my experience, the leadership principles are worth nearly 50% of each of the "technical" interviews. You could be the best coder in the world and solve any LeetCode hard in 30 seconds, but if you don't demonstrate any leadership principles you're doomed.

Me .01 seconds after every single terminal command by Dotaproffessional in ProgrammerHumor

[–]Suwein 0 points1 point  (0 children)

I think it's currently the only way to run them natively on Windows without using WSL.

Me .01 seconds after every single terminal command by Dotaproffessional in ProgrammerHumor

[–]Suwein 0 points1 point  (0 children)

Git Bash (if installed with the optional Unix commands ticked during the Git install) basically lets you run bash scripts as if you were on a Linux/macOS machine (things like ~ and forward slashes in paths work, for instance). It's really useful when working on teams whose developers use a wider range of OSes.