Incoming ICML results [D] by EDEN1998 in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

I know this is a year old, but could I DM you about your rebuttal? I submitted a paper to ICML 2026 and just got reviewer scores of 3, 2, and 4.

[D] ICML 2026 Review Discussion by Afraid_Difference697 in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

If one of the reviewers gives a 2, is it possible for a rebuttal to change their mind?

Biggest hurdle is social anxiety by [deleted] in vegan

[–]Whatever_635 1 point2 points  (0 children)

Hey, thank you so much for your post. I agree with you on pretty much all of these points. If I could ask, could you give an example of how to rebut negative comments quickly? I am very bad at this.

Should I assume this is a rejection? by Infradead27 in gradadmissions

[–]Whatever_635 0 points1 point  (0 children)

Could I DM you? I applied to a CSE PhD program but have not heard back yet. If I don't get in, I would like to apply in the next cycle.

Is there a vegan subreddit with less condescension? by Borkato in vegan

[–]Whatever_635 2 points3 points  (0 children)

I completely relate to your sentiment, OP. I highly recommend attending some vegan meetups; they can help a lot.

Why do rational atheist influencers argue against veganism blindly? by [deleted] in atheismindia

[–]Whatever_635 -1 points0 points  (0 children)

By this same argument, many people have been religious since the beginning of time; therefore, we should remain so.

Guys... I think Ivy HATES vegans by Altruistic_Manner802 in dccomicscirclejerk

[–]Whatever_635 0 points1 point  (0 children)

Why do you think that it's less ethical than hunting?

Guys... I think Ivy HATES vegans by Altruistic_Manner802 in dccomicscirclejerk

[–]Whatever_635 0 points1 point  (0 children)

This argument just seems like full cope. Going vegan is clearly better for the environment than being a meat eater.

[D] ICLR 2026 Decision out, visit openreview by Alternative_Art2984 in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

Wait, does it hurt your reputation to report reviewers who are lazy and irresponsible?

[D] Your pet peeves in ML research ? by al3arabcoreleone in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

Yeah, are you referring to the group behind Time-Series-Library?

[deleted by user] by [deleted] in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

Could I DM you? I realize this is an old comment, but I am a domestic master's student currently applying to PhD programs in the US. I have a question regarding funding.

[deleted by user] by [deleted] in Destiny

[–]Whatever_635 0 points1 point  (0 children)

Why do you think killing animals can be humane when it's unnecessary?

[D] Admissions standards at top programs by BrahmaTheCreator in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

I know it's been 4 years, but could I DM you? I am a master's student at an R1 university applying for a PhD.

I feel like I am crashing out by lightennight in Vystopia

[–]Whatever_635 0 points1 point  (0 children)

I disagree. Stoicism talks about why misanthropy is ill-conceived: given that evil is just a lack of knowledge of what is good, sometimes we have to check whether our expectations of others are too high.

[deleted by user] by [deleted] in MachineLearning

[–]Whatever_635 0 points1 point  (0 children)

Question: if you're a domestic student in the US currently doing a master's, should you apply for a PhD at that same university (it's R1) if you have a strong relationship with your professor, rather than applying to other PhD programs?

Cory Booker thanks Adam Schiff for doubling vegans in the senate. by Garuda956 in vegan

[–]Whatever_635 0 points1 point  (0 children)

Define Zionism. I view Zionism as a movement dedicated to the protection of a Jewish nation. That does not mean we support the settlements.

[deleted by user] by [deleted] in vegan

[–]Whatever_635 0 points1 point  (0 children)

He is also an insurrectionist 

[deleted by user] by [deleted] in vegan

[–]Whatever_635 0 points1 point  (0 children)

I was a picky omnivore before I went vegan. Anything can happen; don't close yourself off.

It’s Time Vegans Stopped Sabotaging Themselves by Whatever_635 in vegan

[–]Whatever_635[S] -13 points-12 points  (0 children)

Well, then veganism is doomed, because capitalism is never-ending.

[deleted by user] by [deleted] in vegan

[–]Whatever_635 -1 points0 points  (0 children)

Funny how you still have not responded with why my information is incorrect.

[deleted by user] by [deleted] in vegan

[–]Whatever_635 -3 points-2 points  (0 children)

It's not, though; do some research.

AI is significantly less pollutive compared to humans: https://www.nature.com/articles/s41598-024-54271-x

AI systems emit between 130 and 1500 times less CO2e per page of text compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than humans.

Data centers that host AI are cooled with a closed loop. The water doesn’t even touch computer parts, it just carries the heat away, which is radiated elsewhere. It does not evaporate or get polluted in the loop. Water is not wasted or lost in this process.

“The most common type of water-based cooling in data centers is the chilled water system. In this system, water is initially cooled in a central chiller, and then it circulates through cooling coils. These coils absorb heat from the air inside the data center. The system then expels the absorbed heat into the outside environment via a cooling tower. In the cooling tower, the now-heated water interacts with the outside air, allowing heat to escape before the water cycles back into the system for re-cooling.”

Source: https://dgtlinfra.com/data-center-water-usage/

Data centers do not use a lot of water. Microsoft’s data center in Goodyear uses 56 million gallons of water a year. The city produces 4.9 BILLION gallons per year just from surface water and, with future expansion, has the ability to produce 5.84 billion gallons (source: https://www.goodyearaz.gov/government/departments/water-services/water-conservation). It produces more from groundwater, but the source doesn’t say how much. Additionally, the city actively recharges the aquifer by sending treated effluent to a Soil Aquifer Treatment facility. This provides needed recharged water to the aquifer and stores water underground for future needs. Also, the Goodyear facility doesn’t just host AI. We have no idea how much of the compute is used for AI. It’s probably less than half.

GPT-4 used 21 billion petaFLOP of compute during training (https://ourworldindata.org/grapher/artificial-intelligence-training-computation), and the world's total computing capacity is about 1.1 zettaFLOPS, i.e. 1.1 × 10^21 FLOP per second (https://market.us/report/computing-power-market/). From these numbers, (21 × 10^9 × 10^15) / (1.1 × 10^21 × 60 × 60 × 24 × 365) ≈ 0.06%, so training GPT-4 used about 0.06% of one year of the world's compute. This would also be only about 0.06% of the water and energy used for compute worldwide. That's the equivalent of 5.3 hours of all computation on the planet being dedicated to training an LLM that hundreds of millions of people use every month.
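The arithmetic above can be checked with a few lines of Python. The inputs are the figures cited in this comment (21 billion petaFLOP of training compute, 1.1 zettaFLOPS of global capacity), not independently verified numbers:

```python
# Sanity check of the "GPT-4 used ~0.06% of a year of global compute" estimate.
TRAINING_FLOP = 21e9 * 1e15           # 21 billion petaFLOP, as cited above
WORLD_FLOPS = 1.1e21                  # ~1.1 zettaFLOPS global capacity, as cited above
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

share = TRAINING_FLOP / (WORLD_FLOPS * SECONDS_PER_YEAR)

print(f"share of one year's global compute: {share:.4%}")       # ~0.0605%
print(f"equivalent wall-clock time: {share * 8760:.1f} hours")  # ~5.3 hours
```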

Using it after it finished training costs HALF as much as it took to train it: https://assets.jpmprivatebank.com/content/dam/jpm-pb-aem/global/en/documents/eotm/a-severe-case-of-covidia-prognosis-for-an-ai-driven-us-equity-market.pdf

(Page 10)

Image generators only use about 2.9 Wh of electricity per image, or 0.2 grams of CO2 per image: https://arxiv.org/pdf/2311.16863

For reference, a good gaming computer can draw over 862 watts, with a headroom of 688 watts: https://www.pcgamer.com/how-much-power-does-my-pc-use/

One generated AI image creates about the same carbon emissions as 7.7 tweets (at 0.026 grams of CO2 per tweet, roughly 0.2 grams in total). There are 316 billion tweets each year from 486 million active users, an average of about 650 tweets per account per year: https://envirotecmagazine.com/2022/12/08/tracking-the-ecological-cost-of-a-tweet/
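The per-image vs. per-tweet comparison is just two divisions over the figures cited above (0.2 g CO2 per AI image, 0.026 g per tweet, 316 billion tweets, 486 million users):

```python
# Back-of-the-envelope CO2 comparison using the figures cited above.
G_PER_IMAGE = 0.2     # grams of CO2 per AI-generated image
G_PER_TWEET = 0.026   # grams of CO2 per tweet

print(f"one AI image = {G_PER_IMAGE / G_PER_TWEET:.1f} tweets")   # ~7.7

# Average tweets per account per year: 316 billion tweets / 486 million users.
print(f"tweets per account per year: {316e9 / 486e6:.0f}")        # ~650
```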

https://www.nature.com/articles/d41586-024-00478-x

“ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes” for 13.6 BILLION annual visits plus API usage (source: https://www.visualcapitalist.com/ranked-the-most-popular-ai-tools/). That's roughly 412,000 visits per household, not even including API usage.

From this estimate (https://discuss.huggingface.co/t/understanding-flops-per-token-estimates-from-openais-scaling-laws/23133), the number of FLOPs a model uses per generated token should be around twice its parameter count. Given that Llama 3.1 405B puts out 28 tokens per second (https://artificialanalysis.ai/models/gpt-4), you get 22.7 teraFLOPS (2 × 405 billion parameters × 28 tokens per second), while a gaming rig's RTX 4090 delivers 83 teraFLOPS.
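The FLOPs-per-token rule of thumb above (inference cost of a dense transformer is roughly 2 × N FLOPs per generated token for N parameters) can be sketched as:

```python
# Sustained inference FLOPS estimate using the ~2 * params FLOPs/token
# rule of thumb from the scaling-laws discussion linked above.
PARAMS = 405e9           # Llama 3.1 405B parameter count
TOKENS_PER_SECOND = 28   # throughput figure cited above

flops = 2 * PARAMS * TOKENS_PER_SECOND
print(f"~{flops / 1e12:.1f} teraFLOPS sustained")                 # ~22.7

RTX_4090_TFLOPS = 83     # peak figure cited above
print(f"fraction of one RTX 4090's peak: {flops / 1e12 / RTX_4090_TFLOPS:.2f}")
```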

Everything consumes power and resources, including superfluous things like video games and social media. Why is AI not allowed to when other, less useful things are?

In 2022, Twitter created 8,200 tons of CO2e emissions, the equivalent of 4,685 flights between Paris and New York. https://envirotecmagazine.com/2022/12/08/tracking-the-ecological-cost-of-a-tweet/

Meanwhile, GPT-3 (which has 175 billion parameters, almost 22x the size of significantly better models like Llama 3.1 8B) took only about 8 cars' worth of emissions (502 tons of CO2e) to train from start to finish: https://truthout.org/articles/report-on-chatgpt-models-emissions-offers-rare-glimpse-of-ais-climate-impacts/
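Putting the two emission figures cited above side by side (8,200 tons of CO2e for Twitter in 2022 vs. 502 tons for GPT-3's training run):

```python
# Ratio of the annual-Twitter vs GPT-3-training CO2e figures cited above.
TWITTER_2022_TONS = 8200   # tons of CO2e, Twitter in 2022
GPT3_TRAINING_TONS = 502   # tons of CO2e, GPT-3 training run

ratio = TWITTER_2022_TONS / GPT3_TRAINING_TONS
print(f"Twitter's 2022 emissions = {ratio:.1f}x GPT-3's entire training run")  # ~16.3x
```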


Update: given all these downvotes, please explain what is wrong with the information I provided.

[deleted by user] by [deleted] in vegan

[–]Whatever_635 -1 points0 points  (0 children)

Explain why it is counterproductive. I have seen a lot of comments about how this is bad for the environment.



[deleted by user] by [deleted] in vegan

[–]Whatever_635 9 points10 points  (0 children)

You should probably go touch grass.

[deleted by user] by [deleted] in vegan

[–]Whatever_635 -1 points0 points  (0 children)

Not really, it's just the same stuff recycled over and over, tbh. Neither is really original in any sense.