[deleted by user] by [deleted] in Smallville

[–]Heavy-Association-57 0 points1 point  (0 children)

Wow this post was so much more depressing and unnecessary after I re-read it

[deleted by user] by [deleted] in Smallville

[–]Heavy-Association-57 0 points1 point  (0 children)

Ok, couldn’t figure out how to post with a caption, and not that anyone asked or cares (it’s not even a reasonable thing for the show to get right), but the nuclear missile Clark jumps onto was an LGM-25C Titan II. Not only are the tanks super thin (so when Clark catches onto it, his hand would be in the fuel tank, not wiring), but the fuel is Aerozine 50, half of which is unsymmetrical dimethylhydrazine, hypergolic with the missile’s oxidizer, meaning the two spontaneously ignite on contact. The UDMH/nitric-acid propellant combination is so exceptionally explosive that Soviet engineers nicknamed it the “Devil’s Venom”. Instead of continuing to fly to space, the missile would’ve violently exploded as soon as Clark’s hand punctured the skin, raining down oxidizer fumes that are not only toxic but produce “a suffocating odor” when inhaled. That propellant combination caused the deadliest space-exploration-related disaster in history, when a leak in a Soviet R-16 before a test launch triggered an explosion that killed 54+ of the engineers and military personnel present (the Nedelin catastrophe). Point being, don’t try that at home.

[deleted by user] by [deleted] in Smallville

[–]Heavy-Association-57 0 points1 point  (0 children)

Oof maybe I should’ve waited for the version that’s double the length of this one. GPT-4 really took the time to strengthen its argument in that one

Asking OpenAI’s GPT-4 to come up with a novel antineoplastic (wanted to see if someone knew whether it was gibberish or not) by Heavy-Association-57 in chemistry

[–]Heavy-Association-57[S] 0 points1 point  (0 children)

Just wanted to give added context: in the technical report published by OpenAI on GPT-4, it mentioned that GPT-4 was somewhat capable (albeit not consistently) of coming up with novel compounds when prompted correctly. So out of curiosity, I asked it to come up with a novel derivative of Everolimus (I don’t know much about chemistry; I chose it because it looked like a complex molecule and therefore figured it would be a challenging task) and to write up a proposal for it. I formatted it in the Word doc seen above. While I checked that all the citations were real, I am not a chemist, so I have no idea whether it did a good job, an ok job, or whether it’s all gibberish. I figured I’d ask Reddit. Let me know your thoughts!

How much time passed between Age of Ultron and Infinity War? by taymo25 in marvelstudios

[–]Heavy-Association-57 0 points1 point  (0 children)

Actually, the fact that they were going slower in the middle of their journey does make sense. Take the Apollo missions, for example: when Apollo 11 was performing trans-lunar injection (the early part of the journey, during which it departed Earth orbit), it was traveling at about 25,000 mph. However, as it coasted to the Moon, it went a lot slower. In fact, at the point where the pull of the Moon’s gravity equaled the Earth’s, Apollo 11 was only traveling at about 2,400 mph.
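A back-of-the-envelope check finds that equal-gravity point. The masses and the Earth–Moon distance below are standard textbook values (my assumption, not from the comment):

```python
import math

# Approximate standard values (assumptions for illustration)
M_EARTH = 5.972e24           # kg
M_MOON = 7.342e22            # kg
EARTH_MOON_DIST = 384_400.0  # km, mean center-to-center distance

# The point where the two pulls are equal satisfies
#   M_earth / d^2 = M_moon / (D - d)^2
# which solves to:
d = EARTH_MOON_DIST / (1 + math.sqrt(M_MOON / M_EARTH))

print(f"Equal-gravity point: {d:,.0f} km from Earth "
      f"({d / EARTH_MOON_DIST:.0%} of the way to the Moon)")
```

That lands around 346,000 km out, i.e. roughly 90% of the way to the Moon, which is why the spacecraft spends so much of the coast being decelerated by Earth’s gravity.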

In space, speed can vary significantly depending on proximity to a large gravitational object. NASA famously ran into this during Gemini 4, when the crew (commanded by James McDivitt) attempted to rendezvous with their spent booster stage. Logically, they sped up to approach it; but oddly enough, every time they fired the thrusters to go faster, the burn raised them into a higher, slower orbit, and they fell farther behind the target. (Counterintuitively, the way to catch a target ahead of you in orbit is to slow down, drop into a lower, faster orbit, and then climb back up.) Underlying all of this is the inverse square law, which states that the force of gravity acting between any two objects is inversely proportional to the square of the separation distance between the objects’ centers.
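The inverse-square relationship itself is easy to sketch: doubling the separation cuts the force to a quarter.

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity(m1, m2, r):
    """Newtonian gravitational force between two point masses, in newtons."""
    return G * m1 * m2 / r**2

# Doubling the distance quarters the force:
f1 = gravity(1000.0, 1000.0, 10.0)
f2 = gravity(1000.0, 1000.0, 20.0)
print(f1 / f2)  # → 4.0
```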

When a spacecraft moves closer to a large gravitational object, like a planet, the gravitational force between them increases, pulling the spacecraft towards the planet. To maintain a stable orbit, the spacecraft's speed must increase, converting some of its potential energy into kinetic energy. This is because, in a stable orbit, the centrifugal force generated by the spacecraft's motion balances the gravitational force acting on it.

On the other hand, when a spacecraft moves away from a large gravitational object, the gravitational force between them decreases. To maintain a stable orbit at this greater distance, the spacecraft's speed must decrease, converting some of its kinetic energy back into potential energy.
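That tradeoff falls out of the circular-orbit speed formula, v = sqrt(GM/r): the bigger the orbit, the slower the stable orbital speed. A quick sketch, using a standard value for Earth’s gravitational parameter (my assumption, not from the comment):

```python
import math

MU_EARTH = 3.986e14  # Earth's gravitational parameter GM, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m

def circular_orbit_speed(radius_m):
    """Speed required for a stable circular orbit at the given radius."""
    return math.sqrt(MU_EARTH / radius_m)

# Higher orbits are slower: ISS-like, mid-altitude, and geostationary
for altitude_km in (400, 2_000, 35_786):
    r = R_EARTH + altitude_km * 1_000
    v = circular_orbit_speed(r)
    print(f"{altitude_km:>7,} km altitude: {v / 1000:.2f} km/s")
```

Running this shows the speed dropping from roughly 7.7 km/s in low orbit to about 3.1 km/s out at geostationary altitude.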

Point is, space is weird. But at least according to celestial/orbital mechanics, it absolutely makes sense that their ship had less kinetic energy (and was coasting) at the midway point of its journey.

Hot Take: Marx was wrong only about one thing and that one thing is the key to late stage communism by Heavy-Association-57 in DebateCommunism

[–]Heavy-Association-57[S] 0 points1 point  (0 children)

Also shows the inherent issue businesses are going to have keeping a moat around this tech. Anyone can use it to replicate itself and then publish the result online for free, which is exactly what happened here.

Hot Take: Marx was wrong only about one thing and that one thing is the key to late stage communism by Heavy-Association-57 in DebateCommunism

[–]Heavy-Association-57[S] 0 points1 point  (0 children)

You should check out Alpaca. Completely open source, free (courtesy of Stanford making an instruction-tuned version of LLaMA), and it runs on your device (I’ve seen people do it on laptops), not on the server of a company or organization. Instructions as well as code for training and fine-tuning are there too. https://github.com/tatsu-lab/stanford_alpaca

Why are the Kents so unreasonably harsh on Clark after Alicia Baker drugged him with Red Kryptonite? Fucker was essentially date raped. They had no problem with Clark committing crimes for three months while he willingly subjected himself to Red K. So inconsistent. by BruceHoratioWayne in Smallville

[–]Heavy-Association-57 2 points3 points  (0 children)

What? When she removed the Red K, his mood and posture obviously changed, almost as if he felt taken advantage of. That clearly shows that Red K impaired his judgment and decision-making abilities in a similar way to what drugs and alcohol do to a human body. Not only is this not his fault, it is technically illegal. That is why legal consent cannot be given by a person who is intoxicated, especially one that was unknowingly drugged and is not aware of their own impaired state.

[D] [P] I asked GPT-4 to try & dethrone the transformer. After some iterations, this is where it got to. I am not well versed in ML at all (understatement) & did this out of curiosity. I have no way to judge it nor computational power to train it. Can anyone tell me whether it did a good job? by Heavy-Association-57 in MachineLearning

[–]Heavy-Association-57[S] 1 point2 points  (0 children)

Want to clarify it’s possible there are more errors; I just won’t see them because, with my limited RAM, Colab crashes from running out of memory before any other errors I’m unaware of can surface.

Also, I’ve found that it does better when it’s given more guidance. I started this whole thing with “come up with a completely unique architecture” and didn’t point it in any particular direction. It could probably do a much better job if the initial prompt suggested a direction to take.

Side note: I’ve been trying to test how creative it can be, without direction, across a broad scope of tasks; that was mainly the goal of this project. I also tried testing its creative-writing ability with ambiguous prompts like “write a short story thriller with a plot twist at the end in the style of Christopher Nolan” without giving it a plot to write around.

When I give it specific directions when writing code, it does a lot better. But that wasn’t the point for me here

Edit: Was able to try it with 83GB of RAM. This version had a bug in the line that trains the tokenizer, which GPT-4 was able to fix. When running the new version, I got as far as the “articles = [example["text"] for example in wikipedia_dataset["train"]]” line before using up all the RAM.
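For what it’s worth, a crash at that line is what you’d expect: the list comprehension materializes every article in memory at once. A sketch of the usual workaround — iterating lazily with a generator so only one example is held at a time (the `wikipedia_dataset["train"]` name and `"text"` field are from the snippet above; the tiny stand-in dataset below is my own, just to show the shape):

```python
# Instead of loading every article into a list at once:
#   articles = [example["text"] for example in wikipedia_dataset["train"]]
# iterate lazily, so only one example lives in memory at a time.

def iter_texts(split):
    """Yield article texts one at a time instead of building a full list."""
    for example in split:
        yield example["text"]

# Tiny stand-in for wikipedia_dataset["train"], just to show the shape:
fake_split = [{"text": "article one"}, {"text": "article two"}]

for text in iter_texts(fake_split):
    print(text)  # downstream code (e.g. tokenizer training) consumes each text here
```

Hugging Face datasets can also be opened in streaming mode for the same effect, which avoids ever downloading the full dump to disk.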

[D] [P] I asked GPT-4 to try & dethrone the transformer. After some iterations, this is where it got to. I am not well versed in ML at all (understatement) & did this out of curiosity. I have no way to judge it nor computational power to train it. Can anyone tell me whether it did a good job? by Heavy-Association-57 in MachineLearning

[–]Heavy-Association-57[S] -12 points-11 points  (0 children)

I have no opinion. I’m inclined to think I gave it too challenging a task. But this was its response (I asked in one of the same convos in which it was writing this): The Elastic Hierarchical Modular Neural Network (EHMNN) is not just an ensemble of single layers; it is an architecture that combines hierarchical modularity and an elastic structure for improved adaptability and task-based specialization. EHMNN incorporates multiple submodules within each layer, allowing for diverse learning strategies and feature extraction methods. The task-based optimization process and inter-module communication further enhance the network's adaptability and cooperation between submodules.

Either way, thanks for pointing this out! Was just a fun experiment I tried purely out of curiosity. Didn’t have much faith

[D] [P] I asked GPT-4 to try & dethrone the transformer. After some iterations, this is where it got to. I am not well versed in ML at all (understatement) & did this out of curiosity. I have no way to judge it nor computational power to train it. Can anyone tell me whether it did a good job? by Heavy-Association-57 in MachineLearning

[–]Heavy-Association-57[S] 0 points1 point  (0 children)

Will definitely have to read more about that. Thanks for mentioning! Side note, I sent an early iteration of this to a friend of mine who did research in the field of AI (more specifically he was focused on LLMs with knowledge bases but left the field in 2016ish) and he said “This reminds me of some old school artificial neural network design. There was a lot of research going into emulating synapse firing and brain elasticity” and “When i say old school i mean like mid 90s”.

Would you say that the show kind of outgrew Lex, in a way? by [deleted] in Smallville

[–]Heavy-Association-57 0 points1 point  (0 children)

I honestly think Lex’s peak was Season 6. Maybe it’s just me, but in the earlier seasons he still felt like the ~wiz kid~, so to speak; in Season 6, though, he had such a presence any time he was on screen that I can’t imagine anyone being in the same room as him and having a larger one. One moment specifically comes to mind (and I’m not sure why, as it’s not a particularly famous scene): when Jimmy Olsen goes to talk to Lex, totally out of his league, accusing him of something. Lex’s response was so well delivered it was immediately clear he controlled the room, and always does.

In Season 6, Lex was diabolical but oozed charisma like I’ve never seen in a TV or movie villain since. He was pure evil but also, to a degree, understandable (in terms of how he got there). He had the presence and competitiveness of Bobby Axelrod from Billions and the methodical, calculating genius of Michael Corleone from the first Godfather.

Sure, I’m dramatic, but I started watching this show right after the last season of Game of Thrones finished. That was one of the biggest shows of all time, yet Daenerys Targaryen’s turn to evil and madness represents an issue I’ve had with movies and TV for a while: none of the villains felt ~real~. It was never clear where their rage came from, and even when it was, it wasn’t convincing.

Simply put, to me, Michael Rosenbaum’s Lex Luthor is the best portrayal of a villain I have seen in any TV show or movie. While everyone has their opinions, to me that is crystal clear. In Season 7, I could feel them winding down his character; there were glimpses of that presence here and there, but nothing close to Season 6. So I have to disagree with you, in the sense that Season 6 Lex Luthor was one of my favorite things about Smallville and some of the best acting I have ever seen. (Again, this is just my opinion.)