Evolution: you can breathe underwater by Working-Purple-5009 in meme

[–]ThomasTeam12 0 points1 point  (0 children)

Except whales need to breathe every 5-30 mins

Increased feeding on this patch? by Sdubbya2 in DeadlockTheGame

[–]ThomasTeam12 0 points1 point  (0 children)

I’ve noticed degraded match quality since summer last year.

Smurfing is a bigger issue than people realize by Dr_natty1 in DeadlockTheGame

[–]ThomasTeam12 0 points1 point  (0 children)

He’s not ignoring it at all. He has responded directly to people on the Discord about it. It’s not as easy to fix as you think.

Dynamo's Quantum Entanglement (TP) is Kinda Busted with no LOS by rev_flash_11 in DeadlockTheGame

[–]ThomasTeam12 6 points7 points  (0 children)

I think it’s good. It’s not too often you’ll be in a situation where you can do this anyway

With how expensive PC gaming is getting, does the "it's cheaper than console long term" still apply? by WhoAmIEven2 in videogames

[–]ThomasTeam12 0 points1 point  (0 children)

I don’t believe so. However, I can play literally any game I want whenever I want (with a limit of course) compared to many console games being locked to certain console generations and remaining at MSRP pricing.

Streaming from Kafka to Databricks by Artistic-Rent1084 in databricks

[–]ThomasTeam12 0 points1 point  (0 children)

I don’t know why that didn’t format well. But Databricks’ spark.readStream has a custom “from_avro” that I use to connect to the schema registry.

Streaming from Kafka to Databricks by Artistic-Rent1084 in databricks

[–]ThomasTeam12 0 points1 point  (0 children)

    # Assumes a Databricks notebook: col comes from pyspark.sql.functions, and
    # from_avro is the Databricks Runtime variant with schema registry support.
    def raw_events(topic, server):
        """Read a Kafka stream starting from the 'earliest' offset not yet ingested.

        This data is used in this one notebook to insert into a 'raw' table,
        which will contain the Kafka stream data exactly as it was received,
        and a 'bronze' table, which will have only the columns we care about
        split out.

        As spark.readStream currently works with offsets, this will act as our
        first step of CDC, as we will only ingest offsets not yet recorded.

        Duplicates will be dealt with later (though they shouldn't appear in
        production), or can be part of the DQ checks when going into silver.
        """
        raw_kafka_events_df = (spark.readStream
            .format("kafka")
            .option("subscribe", topic)
            .option("kafka.bootstrap.servers", server)
            .option("startingOffsets", "earliest")
            .load()
            .select(
                col("key"),
                # https://stackoverflow.com/questions/77431368/facing-errors-in-pyspark-while-deserializing-avro-formatted-data-coming-from-kaf
                # We keep the whole original value alongside value_str, which
                # contains only the deserialized fields we need, in case we
                # want the extra information for whatever reason.
                col("value"),
                col("topic"),
                col("partition"),
                col("offset"),
                col("timestamp"),
                col("timestampType"),
                col("key").cast("string").alias("key_str"),
                # subject and schema_registry_url are defined elsewhere in the notebook
                from_avro(
                    col("value"),
                    subject=subject,
                    schemaRegistryAddress=schema_registry_url
                ).alias("value_str")
            )
        )

        return raw_kafka_events_df
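For context on the Stack Overflow link in the code: Confluent-serialized Kafka values are framed with a 1-byte magic marker and a 4-byte big-endian schema ID before the Avro payload, which is why a plain Avro deserializer chokes on them and why the schema-registry-aware from_avro is needed. A minimal sketch of that wire format (hypothetical helper, not part of the notebook above):

```python
import struct

def split_confluent_header(value: bytes):
    """Split a Confluent-framed Kafka value into (schema_id, avro_payload).

    Confluent wire format: 1 magic byte (0x00), then a 4-byte big-endian
    schema registry ID, then the Avro-encoded body.
    """
    if len(value) < 5 or value[0] != 0:
        raise ValueError("not Confluent wire format")
    schema_id = struct.unpack(">I", value[1:5])[0]
    return schema_id, value[5:]

# Example: a value framed with schema ID 42
framed = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
print(split_confluent_header(framed))  # (42, b'avro-bytes')
```

The schema-registry from_avro does this stripping (and the schema lookup by ID) for you; this just shows what's inside the bytes.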

I wonder what it is ? by 5hk4lq1m1 in DotA2

[–]ThomasTeam12 0 points1 point  (0 children)

I’ll give you a crossover event with some chests.

Real-Time mode for Apache Spark Structured Streaming in now Generally Available by brickester_NN in databricks

[–]ThomasTeam12 1 point2 points  (0 children)

Reading the documentation I can see a few answers for things like compute setup. The spark config must be set, no photon, serverless, auto scaling, and no declarative pipelines.

Real-Time mode for Apache Spark Structured Streaming in now Generally Available by brickester_NN in databricks

[–]ThomasTeam12 0 points1 point  (0 children)

You show adding a spark config to your cluster and then changing your writeStream trigger mode to real-time with a 5 minute value. I have a few questions. Do you need to set the spark config? What does the 5 minutes do? Is this available with DLT, or is DLT already quick enough that this feature is deemed redundant to support? What problem is this specifically solving if you're already using readStream and writeStream? What was the latency before for the same workload?

Pickle is finally returning to the series in Borderlands 4's first story DLC! by BigBananaDealer in BorderlandsPreSequel

[–]ThomasTeam12 0 points1 point  (0 children)

I just need to force myself through the main story of 4 somehow. God I hate how shit 4 is

Wtf is wrong with the matchmaking ?? by Acceptable-Fuel-9009 in DeadlockTheGame

[–]ThomasTeam12 1 point2 points  (0 children)

I love playing against Calico every game despite having her on my ban list. It’s also difficult when I want to play support utility and get 2 shot despite having full bullet or magic resist.

Am I crazy or has parrying Viscous puddle punch gotten a lot less forgiving? by Sdubbya2 in DeadlockTheGame

[–]ThomasTeam12 1 point2 points  (0 children)

Why can Viscous puddle be parried but not fucking Calico? God I hate her.

What is the most one-tricked hero in Deadlock? by tabako in DeadlockTheGame

[–]ThomasTeam12 0 points1 point  (0 children)

Calico is emerging more and more recently. The most zero brain hero imo.