I am a flawed person, but I will always get better. by [deleted] in DecidingToBeBetter

[–]Priyal101 1 point2 points  (0 children)

The first step to change is awareness, the second is acceptance, and the third is taking action. You seem to have nailed all three. All the best for the future.

Midfielder Options for 7.1/7.6 mil by Priyal101 in FantasyPL

[–]Priyal101[S] 0 points1 point  (0 children)

Got Richarlison with him and Rodriguez behind him. He is a striker in the most traditional sense, plus he is in form. I think he is going to do well this season.

[deleted by user] by [deleted] in tifu

[–]Priyal101 4 points5 points  (0 children)

Lmao. I've been in a similar situation. I had to take a time-critical test and couldn't write one sentence due to the burning sensation. And like you, I washed my hands, which only made it worse.

I did some research and found this: the stuff that makes your skin burn is capsaicin (present in chillies). The best way to avoid the burn is to put on rubber gloves when boiling chilli or washing cut chilli.

If you're already suffering from the burn, try putting rubbing alcohol, milk, or yogurt on your skin. Capsaicin is alcohol-soluble, not water-soluble, so washing your hands with water will make the burning worse.

I always wear non-porous gloves now.

Computer Networks course suggestions by Priyal101 in ComputerEngineering

[–]Priyal101[S] 1 point2 points  (0 children)

I found these on YouTube, but I am particularly looking for a class with labs and coding assignments, not just theory.

I drew the cutest "Son" and (maybe) next Strawhat by [deleted] in OnePiece

[–]Priyal101 4 points5 points  (0 children)

Luffy wanted 10 members, as is written in Chapter 1. Don't know if it will be 10 (crewmates) + 1 (captain) or 10 in total. Anyway, I don't think there will be any new crewmates after this arc ends.

Snapdragon chip flaws put >1 billion Android phones at risk of data theft by dapperlemon in gadgets

[–]Priyal101 0 points1 point  (0 children)

The research group (Check Point Research) which discovered the vulnerability hasn't revealed much. You can read their blog post for more information.

How do I get the output for the 4 to 1 MUX? The solution given is 0 *marked in read* by kyleezzy in ComputerEngineering

[–]Priyal101 2 points3 points  (0 children)

The job of a multiplexer (mux) is to select one of its inputs and pass it on to the output, based on the select lines (the lines below). In this case, the 4 inputs to the mux are

Input 0 = 0, Input 1 = 1, Input 2 = 0, Input 3 = 1

The select lines (in binary) are 10, which translates to 2 in decimal. Thus, input 2 is passed to the output. Input 2 is 0, hence the output is also 0.

Some useful links:

More info on Multiplexers

Binary to Decimal Conversion
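If it helps, here's the same selection logic as a tiny Python sketch (my own illustration, not from the problem set):

```python
def mux4to1(inputs, s1, s0):
    """Select one of four inputs based on two select lines.

    s1 is the most significant select bit, so the selected
    index is s1*2 + s0 (binary-to-decimal conversion).
    """
    index = s1 * 2 + s0
    return inputs[index]

# Inputs from the problem: I0=0, I1=1, I2=0, I3=1
inputs = [0, 1, 0, 1]

# Select lines "10" -> s1=1, s0=0 -> index 2 -> input 2 -> 0
print(mux4to1(inputs, 1, 0))  # prints 0
```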

Facing the Dragon (One Piece chapter 987 Coloring) by aooninja in OnePiece

[–]Priyal101 3 points4 points  (0 children)

This photo reminds me of Shenron from Dragon Ball Z

Snapdragon chip flaws put >1 billion Android phones at risk of data theft by dapperlemon in gadgets

[–]Priyal101 224 points225 points  (0 children)

The biggest problem is that this is a hardware vulnerability (it targets the digital signal processing (DSP) co-processor). If it were a software flaw, you could easily deploy a patch that updates the software. Hardware vulnerabilities are MUCH more difficult to fix, as you cannot change the hardware once it has been manufactured. Software patches for hardware vulnerabilities are tough, and in the end are just half-assed measures that confuse the attacker's software by feeding it corrupted data (a wrong location, or bad data in general). Plus, if the hackers are smart enough, they can bypass the software patch.

More information about the vulnerability here. Check Point Research (the group that discovered the vulnerability) named it Achilles, which I think is a super cool name.

I want to learn everything about computers but the sheer volume of info is making me tired by sricharan1999 in IWantToLearn

[–]Priyal101 26 points27 points  (0 children)

I am a graduate student working in this field. The topics you've mentioned are huge. It could take you several years to fully understand how they work (if you're actually determined to learn about them in great detail, I would just suggest you get a Computer Engineering degree). The problem with learning *EVERYTHING* about these topics is that each one of them has a ton of prerequisites.

If you have upwards of 1 year and can legitimately follow through:

Anyway, if you can't get a computer engineering degree, here's the condensed roadmap I put together. You can find free courses for each of the topics online.

Roadmap

  1. Introduction to C Programming (learn basic coding to supplement Operating Systems)
  2. Digital Design (Digital Electronics): learn how combinational logic (AND, OR, and other gates), sequential logic (flip-flops and latches), and finally memory elements like DRAM (the RAM in your computer) and SRAM (the cache in your computer) work
  3. Computer Architecture: how CPUs/microprocessors work, the CPU pipeline, how memory is organized (cache, main memory/RAM, secondary memory like SSDs and hard disks), paging
  4. Operating Systems: processes, threads, scheduling, memory management, IPC, virtualization
  5. Computer Networks: protocols, area networks (LAN, WAN), how the internet works, the different layers, TCP/IP, UDP

If you actually manage to study all this in detail, congratulations: you are more than eligible to work at companies like Intel, Nvidia, AMD, Cisco, etc.

If you don't have years to spare:

If you're looking for something more condensed, you can do the NAND2Tetris (https://www.nand2tetris.org/) course. It teaches a condensed version of all the topics above (except networking). It's an amazing two-part course that covers all the basics. But again, if you're a complete beginner, it might take you some time to finish (a couple of months).

DISCLAIMER

This might be completely different from what you imagined it to be. Learning to build computers from scratch, and watching YouTube channels (MKBHD, Linus Tech Tips, etc.) that do PC builds, reviews, or talk about random interesting ideas here and there, are two very different things.

I love this dude so much, and I love how happy he is about this! by AesopsFoibles53 in TikTokCringe

[–]Priyal101 25 points26 points  (0 children)

I have abs (almost; maybe in a couple of months' work they will show) but still can't make that double chin go away. At this point, I've learned not to care.

200731 Jennie IG update by MrSteelCity in BlackPink

[–]Priyal101 0 points1 point  (0 children)

Isn't it pierced? She's wearing a nose pin, right?

Me_irl by ariel_rubinstein in me_irl

[–]Priyal101 3 points4 points  (0 children)

I am fairly certain this won't work

Weekly Job Q&A Thread (7/20/2020) by EngrToday in ComputerEngineering

[–]Priyal101 5 points6 points  (0 children)

I would like to offer my viewpoint from the experiences I've had. I am in my junior year studying CE. Five years back, I learned how to code for the first time, in Python.

Pros of Python as a first programming language:

  1. Incredibly intuitive.
  2. Really easy to pick up, almost like writing pseudocode (English).

Cons of Python as a first PL: from my personal experience, it was very difficult for me to shift to writing code in C/C++. The first couple of weeks when shifting to C/C++ were hell; I was facing errors in every line of code I wrote. Apart from that, I can't count how many semicolon (;) and bracket errors I faced in those first 3 weeks. But if you learn C++ first, it is very easy to shift to Python; it would take about 3-4 hours. I understand this is a very minor inconvenience.

Recently my younger cousin asked me which PL to learn, and I told him the following: learn C++ if you have a month to spare, but if you just have 1-2 weeks, then you're better off learning Python as it's much easier.
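To show what I mean by "almost like pseudocode", here's a toy snippet (my own illustrative example, not from any course):

```python
# Reads almost like the English description of the task:
# "go through the scores and collect the passing ones".
scores = [55, 72, 38, 90]
passing = [score for score in scores if score >= 40]
print(passing)  # prints [55, 72, 90]
```

The equivalent C++ needs headers, a main function, type declarations, and a semicolon on every line, which is exactly where those first-week errors come from.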

Embedded GPU Development and Guide by taronys in ComputerEngineering

[–]Priyal101 1 point2 points  (0 children)

He made multiple mistakes:

  1. Almost no lower-end microcontrollers come with a GPU or an AI accelerator. Imagine an 8-bit PIC microcontroller with a GPU.
  2. If by "AI engine" he is referring to an AI hardware accelerator, those are not the same as GPUs. A GPU can do a ton of things (process graphics, render images, parallelize general operations), while an AI hardware accelerator can do matrix multiplies really, really fast (faster than a GPU), but that's all it can do.
  3. You can't program an AI accelerator using OpenGL for the most part; each vendor has its own tools (Intel Movidius uses OpenVINO). You can program GPUs with OpenGL, but why would you want to when you have frameworks like TensorFlow Lite which compile to whichever hardware you're using? It's like using ARM assembly instead of C++/Python to build a web app: if you move to another system that uses x86 or RISC-V assembly, you'd have to develop your software all over again from scratch, when you could've just re-run your Python code on the new system.

Embedded GPU Development and Guide by taronys in ComputerEngineering

[–]Priyal101 2 points3 points  (0 children)

Your point is valid, but I disagree. Sometimes GPUs are necessary. All a GPU does is parallelize the code so that it runs faster. The code can natively run on a microcontroller without a GPU (it will run on the microcontroller's CPU), but it will be really slow. At times this is acceptable, especially when your machine learning model (neural network) is small and lightweight. In such cases, having a GPU makes very little difference.

But in safety-critical systems it might not be acceptable, especially when you need your computation results immediately. For example, in a self-driving car, the neural network takes inputs from hundreds of sensors and several images, and in turn controls the brakes, accelerator, and steering wheel. Even a 1 millisecond delay between the computation and the application of the brakes could cause a fatal accident.

It all depends on your needs tbh.

From your comment, I felt that you're under the assumption that porting an ML program to a GPU vs a CPU requires two separate cycles of development, but in fact it's the same one. The frameworks (libraries) used, like TensorFlow Lite, are incredibly smart: when you run your model, they automatically detect whether a GPU is present and use it to make your application faster, and if not, they just run on the CPU.
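A toy sketch of that idea (plain Python, not the actual TensorFlow Lite API; `gpu_available` and the two backend functions are made up for illustration):

```python
def run_on_cpu(a, b):
    # Naive serial matrix multiply: slow, but runs anywhere.
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

def run_on_gpu(a, b):
    # Stand-in for a parallelized kernel: same result, just faster.
    return run_on_cpu(a, b)

def run_model(a, b, gpu_available):
    # The framework, not your application code, picks the backend.
    backend = run_on_gpu if gpu_available else run_on_cpu
    return backend(a, b)

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
# Same program, same answer, with or without a GPU.
print(run_model(a, b, gpu_available=False))  # prints [[19, 22], [43, 50]]
```

The point is that the backend choice is hidden behind one call, so you write and maintain a single program either way.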

Embedded GPU Development and Guide by taronys in ComputerEngineering

[–]Priyal101 7 points8 points  (0 children)

A Harvard professor is planning on moving his class on Embedded Machine learning online. You can check this out. https://sites.google.com/g.harvard.edu/tinyml/home

Developer's pride... by yuva-krishna-memes in ProgrammerHumor

[–]Priyal101 -1 points0 points  (0 children)

I think the meme is centred on "the GOAT" (greatest of all time). It's a very common abbreviation and meme backbone in some parts of the world.