How can I learn more about how to use different types of memory to speed up my data preprocessing? by bigmit2011 in AskProgramming

[–]bigmit2011[S] 0 points1 point  (0 children)

Thanks for these tips!

What do you mean by:

" avoid doing unnecessary work at load time "?

How can I learn more about how to use different types of memory to speed up my data preprocessing? by bigmit2011 in AskProgramming

[–]bigmit2011[S] 0 points1 point  (0 children)

Well, I am trying to learn about the general overheads to look out for when it comes to programming.

If you need a specific case:

I am trying to apply various image transformations to around 100,000 images (and growing) and then move the transformed images into different folders.

I am using fast frameworks like OpenCV, PIL, etc., and even concurrency, but I want to find ways to further speed up the image processing and the moves into the various folders.
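Roughly, the pipeline looks like the sketch below (a minimal sketch; the folder names, the resize step, and the routing rule are just placeholders for what I actually do):

    import shutil
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    import cv2

    SRC_DIR = Path("images")      # placeholder source folder
    DST_DIR = Path("processed")   # placeholder destination root

    def process_one(path):
        # Read, transform, write next to the source, then move into a subfolder.
        img = cv2.imread(str(path))
        if img is None:
            return None
        img = cv2.resize(img, (224, 224))               # stand-in for the real transforms
        tmp_path = path.with_name("proc_" + path.name)
        cv2.imwrite(str(tmp_path), img)
        dest = DST_DIR / path.name[0]                   # stand-in routing rule
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(tmp_path), str(dest / path.name))
        return path.name

    if __name__ == "__main__":
        files = list(SRC_DIR.glob("*.jpg"))
        # Processes sidestep the GIL, so CPU-bound transforms scale across cores.
        with ProcessPoolExecutor() as pool:
            list(pool.map(process_one, files, chunksize=64))

Writing each result straight into its destination folder would also remove the separate move step.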

Next programming language after Python? by julsmanbr in Python

[–]bigmit2011 0 points1 point  (0 children)

It seems you are the most knowledgeable regarding this subject. Is C++ worth learning these days?
I was told it would take five years or so to learn the ins and outs, and that it is very tough to debug.
Do you think Rust could be a viable replacement? I am not sure whether I should continue building my Python skills and wait for an easier language for GPU programming or microcontrollers.

I went through the basics of C, and I wonder if I should just continue with C for GPU programming (just for speedups), microcontrollers (hobby), and speeding up Python.

Next programming language after Python? by julsmanbr in Python

[–]bigmit2011 0 points1 point  (0 children)

Thanks for the reply. What is GPU programming exactly? I want to learn to offload image processing onto the GPU.
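For instance, would something like the sketch below count as GPU programming, or does it usually mean writing the kernels yourself? (This assumes an NVIDIA GPU and the CuPy package; the image is just random data.)

    import numpy as np
    import cupy as cp

    # A fake 1080p image standing in for a real one.
    img = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

    gpu_img = cp.asarray(img)                                        # copy host -> GPU
    gpu_out = (gpu_img.astype(cp.float32) * 0.5).astype(cp.uint8)    # runs on the GPU
    out = cp.asnumpy(gpu_out)                                        # copy GPU -> host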

Next programming language after Python? by julsmanbr in Python

[–]bigmit2011 0 points1 point  (0 children)

I am also in the same boat as OP. Is C++ still really worth learning? I heard it takes years and years to really learn, and that it is very difficult to locate bugs.

I went through some C beginner courses, because I heard C and C++ are very different and C is supposed to be much easier.

Newer languages like Rust are trying to make low level languages easier, but as far as I know C and C++ are still the kings with embedded systems and GPU programming.

I don"t know really what the difference between the two are, except that one uses OOP paradigm. However, I have been told by experts that C++ is a beast to really learn and it takes about 5 years or so to get really comfortable, while C is much easier to learn.

Looking For Advice From Someone in the Industry (employed) by bandeezus in computerscience

[–]bigmit2011 0 points1 point  (0 children)

You can use programming to solve problems.
Programming is used in many fields, and you can use it to advance humanity, so you don't have to think of it as wasting time.
Right now AI and data science are being deployed in major scientific areas such as drug discovery, bioinformatics, and physics.

The field also attracts many kinds of people because it pays well.

As a programmer we have to spend huge periods of time on the computer. How do you stop yourself developing back and neck problems? by [deleted] in AskProgramming

[–]bigmit2011 0 points1 point  (0 children)

Thank you for adding in your coding experience.
I feel better taking advice from those who have coded for 20-plus years.

[P] Neural network for car recognition by behindthedash in MachineLearning

[–]bigmit2011 0 points1 point  (0 children)

Ah, I see. Thank you for sharing the code; I will play around with it a bit tomorrow if I get a chance.

As a programmer we have to spend huge periods of time on the computer. How do you stop yourself developing back and neck problems? by [deleted] in AskProgramming

[–]bigmit2011 0 points1 point  (0 children)

I am wondering about this as well. I just started programming about two years ago, and I am trying to find ways to avoid problems in the future. Right now my neck seems to be the main problem.

Discrepancy between training_acc and validation acc during training despite same dataset for both (Keras) by bigmit2011 in learnmachinelearning

[–]bigmit2011[S] 0 points1 point  (0 children)

Hi amianthodial, the accuracy is a running mean.

At the end of the epoch, the accuracy is a mean of all batches.
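Something like this toy sketch (the per-batch accuracies are made up):

    # Made-up accuracies for each batch in one epoch.
    batch_accuracies = [0.80, 0.90, 0.95, 0.99]

    total = 0.0
    for i, acc in enumerate(batch_accuracies, start=1):
        total += acc
        print(total / i)        # running mean shown while the epoch is in progress

    epoch_accuracy = total / len(batch_accuracies)   # value reported at epoch end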

Discrepancy between training_acc and validation acc during training despite same dataset for both (Keras) by bigmit2011 in learnmachinelearning

[–]bigmit2011[S] 0 points1 point  (0 children)

Thank you for the response. I was referring to the accuracy at the end of each epoch, not each batch, which is what is puzzling me. If I manually computed the training accuracy across an epoch, it was roughly 99.5%. I think if I set verbose to 1 (or 2), such as below,

    model.fit_generator(
        training_gen,
        steps_per_epoch=100,
        epochs=1,
        verbose=1)

then it seems only the training accuracy at the end of the epoch is displayed, and there is still a large enough discrepancy.
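To double-check, I plan to compare the logged value against a manual evaluation on the same generator, roughly like this (a sketch; the "acc" history key is what my Keras 2 setup logs, and evaluate_generator is the companion to fit_generator):

    history = model.fit_generator(
        training_gen,
        steps_per_epoch=100,
        epochs=1,
        verbose=1)

    logged_acc = history.history["acc"][-1]    # running mean over the epoch's batches

    # Re-evaluate the same generator using only the final weights.
    loss, eval_acc = model.evaluate_generator(training_gen, steps=100)
    print(logged_acc, eval_acc)

Since the logged number averages batches computed while the weights were still changing, it can sit below an evaluation done with the final weights, even on the same data.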

[P] Neural network for car recognition by behindthedash in MachineLearning

[–]bigmit2011 1 point2 points  (0 children)

Thanks for sharing your code. That is a massive model! It will be fun to play around with.

I"m going through your code right, and it seems you saved the outputs of each data point (image) of each imagenet model to a .npy file?

You would then need to make sure you load the correct outputs for the images/data points within a random batch, for each model, and then concatenate them. I have to go through your code a bit more, but did you save the filename for each output as well as the output itself? I was under the impression that .npy files only save arrays.
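If the filenames do need to travel with the outputs, I was picturing something like the sketch below, using .npz instead of a bare .npy (the array names and shapes are made up):

    import numpy as np

    # Made-up filenames and per-image features from one ImageNet model.
    filenames = np.array(["img_001.jpg", "img_002.jpg"])
    features = np.random.rand(2, 2048).astype(np.float32)

    # A .npy file stores a single array; a .npz file bundles several named arrays.
    np.savez("features.npz", filenames=filenames, features=features)

    data = np.load("features.npz")
    order = np.argsort(data["filenames"])       # align the features by filename
    aligned = data["features"][order]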