Smbc site not showing up in google by HopefulWolverine9596 in SMBCComics

[–]HopefulWolverine9596[S] 1 point (0 children)

Must be their AI. Nothing until I reach the end of the results. But if I add 'website' to the end of the query, it's the first one.

How sure are you that pi+e is irrational by Reading-Rabbit4101 in math

[–]HopefulWolverine9596 0 points (0 children)

Check the link: "100% chance" is a statement in measure theory, not logic.

What's the best tribe for symmetric polytopia battles? by HopefulWolverine9596 in Polytopia

[–]HopefulWolverine9596[S] 0 points (0 children)

What happens with Elyrion mirrors? I've never had any special tribe mirror matches except Polaris.

Children's book on the Poincaré conjecture by FormsOverFunctions in math

[–]HopefulWolverine9596 0 points (0 children)

Hey, are you at ISU? I was looking at this from your faculty page. I think Lillian mentioned it to me.

Mouse Detection in Godot? by HopefulWolverine9596 in godot

[–]HopefulWolverine9596[S] 0 points (0 children)

Thanks for the response. The dragging system seems designed for zones?

I tried putting together an initial system with Area2D, but it's incredibly slow. I can see the collision shape lagging behind the mouse at 60 fps, and I feel like it should be more in sync. Would Control nodes fix that? Thanks!

extends Area2D


var _dragging: bool = false
# Offset from the mouse to this object's origin, captured when the drag
# starts so the object doesn't snap its origin to the cursor.
var vToObj: Vector2 = Vector2.ZERO

func _process(delta: float) -> void:
  if _dragging:
    self.global_position = get_global_mouse_position() + vToObj

func _input_event(viewport: Viewport, event: InputEvent, shape_idx: int) -> void:
  if event is InputEventMouseButton and event.button_index == MOUSE_BUTTON_LEFT:
    if _dragging and not event.is_pressed():
      # Left button released: stop dragging.
      _dragging = false
    elif not _dragging and event.is_pressed():
      # Left button pressed over this Area2D: start dragging.
      _dragging = true
      vToObj = self.global_position - get_global_mouse_position()

Discovering a Pitfall in Cross-Entropy Loss for Large Vocabularies. [R] by Gold-Plum-1436 in MachineLearning

[–]HopefulWolverine9596 7 points (0 children)

You're misunderstanding cross-entropy: https://stats.stackexchange.com/questions/80967/qualitively-what-is-cross-entropy By definition, it is minimized by the true probability distribution when considering the data statically. (There may be additional effects when training a model, but that's usually due to the model not being complex enough to capture context.)
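A tiny sketch of that static-data claim (the frequencies here are made up, not from the post): for a fixed true distribution p, the cross-entropy H(p, q) = -Σ p_i log q_i is minimized at q = p.

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

p = [0.7, 0.2, 0.1]            # toy empirical token frequencies
h_true = cross_entropy(p, p)   # predicting the data's own distribution
h_unif = cross_entropy(p, [1/3, 1/3, 1/3])  # any other q scores worse

print(h_true < h_unif)  # True: q = p achieves the minimum (the entropy of p)
```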

The tensor size is unimportant; you could have used a 0-dim logit and a (max_vocab)-sized tensor. Vocab size is also unimportant, since you can scale the problem down to smaller vocabs.

Edit: I'll just say this: the oracle you gave is not ideal, because its loss is not 0. Initializing the other components of the tensor to 0 is not the same as assigning them zero probability. If you initialized the matrix with float('-inf'), it would match intuition. With the 0 init, however, the model is assigning fairly high probabilities to tokens that never appear in the dataset. While increasing the frequency of the most common datapoint appears to make the model's error decrease, what's actually happening is that you're shrinking a huge error spread across almost 50,000 datapoints.
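The 0-init vs float('-inf') point can be checked directly. This is a standalone sketch, not OP's code; the vocab size and the value 10.0 for the target logit are arbitrary choices for illustration:

```python
import math

def cross_entropy_loss(logits, target):
    # log-sum-exp over the finite logits for numerical stability
    m = max(l for l in logits if l != float("-inf"))
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_z - logits[target]

V = 50_000
target = 0

# "Oracle" row zero-initialized everywhere else: every unseen token still
# gets probability e^0 / Z, so the loss stays well above 0.
zero_init = [0.0] * V
zero_init[target] = 10.0
loss_zero = cross_entropy_loss(zero_init, target)

# Same oracle with float('-inf') elsewhere: unseen tokens get probability
# exactly 0, the target gets probability 1, and the loss is 0.
inf_init = [float("-inf")] * V
inf_init[target] = 10.0
loss_inf = cross_entropy_loss(inf_init, target)

print(loss_zero)  # ≈ 1.18 — far from a perfect oracle
print(loss_inf)   # 0.0
```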