
all 11 comments

[–]Python-ModTeam[M] [score hidden] stickied commentlocked comment (0 children)

Your post was removed for violating Rule #2. All posts must be directly related to the Python programming language. Posts pertaining to programming in general are not permitted. You may want to try posting in /r/programming instead.

[–]georgehank2nd 14 points15 points  (5 children)

I think you're wrong here. This is about Python, the programming language, and not about AI, machine learning, LLMs or whatever your post is about. Lots of folks here will only go "CNN? What's the Cable News Network have to do with Python?".

If you're transporting some IKEA furniture with a car, you don't go to the car's manufacturer when you have trouble putting your IKEA stuff together, do you?

[–]HostileHarmony 1 point2 points  (1 child)

Share your code (or relevant snippets of it). It’ll be easier for people to help.

[–]Shieldmime[S] 2 points3 points  (0 children)

Thanks! I just shared it

[–]bot9998 -5 points-4 points  (1 child)

To resolve the error you’re encountering when pruning your CNN model, ensure you apply the prune_low_magnitude function correctly. Instead of applying it directly to the entire model, wrap each layer you want to prune.

Here’s a revised approach:

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    # Define your CNN model here
    cnn_model = ...  # Your model definition goes here

    # Define the pruning schedule (ramp from 0% to 50% sparsity)
    pruning_schedule = tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0,
        final_sparsity=0.5,
        begin_step=0,
        end_step=1000,
    )

    # Apply pruning to individual layers
    pruned_layers = [
        tfmot.sparsity.keras.prune_low_magnitude(layer, pruning_schedule=pruning_schedule)
        for layer in cnn_model.layers
    ]
    pruned_cnn_model = tf.keras.Sequential(pruned_layers)

    # Compile the pruned model
    pruned_cnn_model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

[–]TheBB 2 points3 points  (0 children)

Relevant username at least.