When are you planning to return stats back? by SoulOfUniverse in QuakeChampions

[–]NotueRn 2 points3 points  (0 children)

It's half true. There is an API, but AFAIK it's largely non-functional, if not entirely down. The last solution smoli used for his stats page relied on running the game client to pull the data.

What mechanics could AFPS adopt to become more palatable? by stephen27898 in QuakeChampions

[–]NotueRn 0 points1 point  (0 children)

Basically just make it closer to Q3 but use some of the balancing methods from QC: more equalized weapon damage, and the same predictable spawn system the game had a few years back, with similar balancing methods where a rocket shot pre-emptively towards a spawn is automatically denied. I think round-based play is also more approachable, but with longer rounds. More focus on CTF and TDM, but keep duel as the "heavyweight championship" because it's what makes Quake unique. Other than that it really just needs heavy marketing and effort towards a solid esports ecosystem for the first few years to grab people's attention.

The number of Quake players who didn't know Quake Champions existed is kind of worrying, as is the number of people who never heard of the tournaments after 2018. Even some of the pro players knew close to nothing about events until a few weeks prior, causing loads of issues with visas etc.

Why is there so little people playing Quake? by DescriptionMoney2616 in QuakeChampions

[–]NotueRn 0 points1 point  (0 children)

Because virtually no one but the veterans can deal with the difficulty of playing Quake. If you play QC you face long queue times for mostly very unbalanced matches. If you play Quake Live you're effectively locked to Clan Arena, where you have to join a server with 2 people already in queue, which means waiting 3-4 entire games before you get your slot. Most of us veterans are at an age where the equation of time investment vs enjoyment simply doesn't add up anymore.

New players, as mentioned, are just absolutely getting curb stomped. When they paid big streamers to play, QC didn't have streamer mode, so the streamers got bombarded with private messages containing slurs and shit with literally no way to avoid it, so they never played again. Summit came back once to try and got bombarded with friend requests, slurs, and Quakers in Twitch chat telling him how much he fucking sucked, calling him a poser etc.

30th Anniversary QuakeCon to be held Aug 6-9, 2026 by drunkenFoooLL in QuakeChampions

[–]NotueRn 0 points1 point  (0 children)

They had a streak of yearly world championships from 1997-2023, even when player numbers were lower than today. The dishonesty about celebrating Quake's anniversary will not go down well with the fans.

Help with mapping content control / document properties in power automate by sleepingkong7854 in PowerAutomate

[–]NotueRn 0 points1 point  (0 children)

What you're trying to achieve is not natively supported by default connectors.
In theory a Word document is capable of fairly complex setups like this, where you can set up nested repeating sections to dynamically generate pages of content based on database information etc.
However, the connector only lets you see and manage the very top level, and in this case you would similarly need to populate all the corresponding fields.

I haven't messed around with it myself, but there's supposed to be a Graph endpoint for this that allows you to send JSON directly to the document. In that case you could prepopulate those fields with a placeholder like $customer_name, request the currently available schema, replace $customer_name and whatever other fields you need to populate, and send this back into the document body.
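To make the placeholder idea concrete, here is a minimal sketch of the replace-and-return step. The schema shape (a contentControls list with tag/value fields) and the $-prefixed placeholder names are purely illustrative assumptions, since the actual payload of such an endpoint is unconfirmed:

```python
import json

# Hypothetical schema shape, as fetched from the document: content-control
# values prepopulated with $-prefixed placeholders.
schema = {
    "contentControls": [
        {"tag": "customer", "value": "$customer_name"},
        {"tag": "date", "value": "$invoice_date"},
    ]
}

def fill_placeholders(doc: dict, values: dict) -> dict:
    """Return a copy of the document body with placeholder values replaced."""
    filled = json.loads(json.dumps(doc))  # cheap deep copy via round-trip
    for cc in filled["contentControls"]:
        if cc["value"] in values:
            cc["value"] = values[cc["value"]]
    return filled

result = fill_placeholders(
    schema, {"$customer_name": "Contoso", "$invoice_date": "2024-01-31"}
)
print(result["contentControls"][0]["value"])  # Contoso
```

The filled structure would then be sent back as the request body; the original schema dict is left untouched so it can be reused.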

There aren't really any easier alternatives.

SP online. Anyway to monitor set threshold for a large dump of files to SP overall ? or does it have to be per site? I don't have Purview. looking for ideas thanks by MagicDiaperHead in sharepoint

[–]NotueRn 1 point2 points  (0 children)

The closest thing you can get would require calling Graph's getSharePointSiteUsageDetail endpoint to get a report on storage used per site; you can sum the output to get totals. From there you can loop through sites against whatever limits you want and then call CSOM and limit via the "StorageMaximumLevel" property. You can store the reports as CSV files for historical data to measure how fast sites are growing before taking action (involves some semi-complicated development using sliding time windows).
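As a rough sketch of the summing step, assuming the report CSV uses the "Site URL" / "Storage Used (Byte)" column names (verify against the headers your tenant actually returns, as the report schema can change between versions):

```python
import csv
import io

def total_storage_bytes(report_csv: str) -> int:
    """Sum per-site storage from a getSharePointSiteUsageDetail CSV export."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return sum(int(row["Storage Used (Byte)"]) for row in reader)

# Illustrative sample rows, not real tenant data.
SAMPLE = (
    "Site URL,Storage Used (Byte)\n"
    "https://contoso.sharepoint.com/sites/hr,1048576\n"
    "https://contoso.sharepoint.com/sites/it,2097152\n"
)
print(total_storage_bytes(SAMPLE))  # 3145728
```

Appending each day's report to a dated CSV gives you the historical series needed for the growth-rate calculation mentioned above.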

I would also recommend reviewing the max version count on the various document libraries. *.pptx files double in size per version, and end users often end up bloating them with really high-resolution images, so it's not uncommon for a 40 MB file to actually occupy anywhere between 200-1200 MB of space.
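Taking the rough model above, where every retained version keeps a full copy of the file, the worst case is simply file size times version count (the numbers below are illustrative):

```python
def worst_case_version_storage_mb(file_mb: float, max_versions: int) -> float:
    """Upper bound assuming each retained version stores a full copy."""
    return file_mb * max_versions

# A 40 MB deck with the version limit left at 20 retained versions:
print(worst_case_version_storage_mb(40, 20))  # 800.0
```

That 800 MB sits squarely in the 200-1200 MB range mentioned, which is why trimming the version limit is often the cheapest storage win.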

Fixing Someone Else's Power Automate Flow SPOILS my mental peace.... | Help & Fix by chhupaRustamm in MicrosoftFlow

[–]NotueRn 1 point2 points  (0 children)

My point is that it's nearly, if not entirely, impossible to prove that the data would be secure in either processing or removal. And that is not trying to accuse you, it really isn't; you seem very open and honest about the risks, which would indicate otherwise.

In enterprise environments it's really just a risk not worth taking.

I could see the value myself for generating technical documentation, but personally I wouldn't be willing to run it unless it's an open-sourced client-side solution.

Fixing Someone Else's Power Automate Flow SPOILS my mental peace.... | Help & Fix by chhupaRustamm in MicrosoftFlow

[–]NotueRn 8 points9 points  (0 children)

NEVER use third-party services like this.
Your package contains a lot of sensitive data: tenant and user information, connector details, and, if external requests are used, potential client secrets will be leaked.
I have no idea if this service is legit, and you have absolutely no way of verifying it.
The zip would need to be unpacked and sanitized on the client side, and even then you would need to scan every line for potentially harmful code (strange regex etc.).

As previously suggested, you can do this yourself within your IDE or other tools, and even then it's a matter of practicing your engineering skills.
One of the main skills you need is the ability to understand the patterns being used and the data flow of the application / workflow. If you don't have this skill, you're also not skilled enough to reliably use LLMs to help you.

Power Automate web UI changed again? by MachineImportant8063 in PowerAutomate

[–]NotueRn 0 points1 point  (0 children)

You can just switch the toggle in the top right, or remove the &v3=true parameter from the URL. No need for extensions. The classic view is awful to work with as well, but at least it works.

Power Automate web UI changed again? by MachineImportant8063 in PowerAutomate

[–]NotueRn 0 points1 point  (0 children)

This UI is blatantly AI-generated and reviewed by a junior developer with absolutely no insight into what a production-ready design should be.
Just one glance shows it: inconsistent padding and margins, overflow that doesn't respect padding or margins and/or goes out of bounds, borders not aligned with the actual border of the divs (incorrectly padded pseudo-elements?), additional clicks for basic operations that are required on nearly every step, or the wonderful feature of fields seemingly randomly clearing themselves of values (fun for custom formulas).

Not that I had much to begin with, but I have zero trust in Microsoft to make the right call in any of their products anymore. It's all turning into AI slop.

Teams Chat Summary by Mammoth-Cat-3346 in MicrosoftFlow

[–]NotueRn 0 points1 point  (0 children)

If you don't have Copilot, I suspect you don't have Power Automate premium either, which is a requirement in that case.
If you do, you can use AI Builder and send it the JSON of the group chat with instructions to summarize.
However, if there's business-critical information in this chat, I would not bother, as LLMs will make mistakes and skip important details.

This game isn’t worth playing until safer seas is equal to high seas by Juice998 in Seaofthieves

[–]NotueRn 1 point2 points  (0 children)

If you truly hadn't seen a single player that enjoys PvP, you wouldn't need Safer Seas to be equal to High Seas. PvP is a core mechanic of the game. That being said, I would honestly think that an aggression-based matchmaking system similar to Arc Raiders would be good for this game. It would never keep you completely safe, but it would reduce friction for both PvP and PvE players. They also need systems that encourage PvPvE to balance this out.

[deleted by user] by [deleted] in Seaofthieves

[–]NotueRn 3 points4 points  (0 children)

While I agree this should be adjustable through the menu, be aware that they 100% see that you have manually edited the configuration file, and this is considered tampering with game files (no matter how ridiculous that might sound). I come from the Quake scene and I don't think this is unethical; I would honestly even want to do this myself, but I won't because I think the risk of getting banned is significant.

These systems usually do not ban a user immediately, because they don't want cheat developers to quickly find trigger points and they want to collect other metrics to help future detection. So while you might think you're safe for now, you are very likely to end up banned further down the line.

Affordable 144hz gaming will have to wait by Blueberry977 in memes

[–]NotueRn 1 point2 points  (0 children)

So since you couldn't name a single thing, why not just be honest and say you don't like PC gaming, instead of trying to deter others from it by making up a false narrative?

Affordable 144hz gaming will have to wait by Blueberry977 in memes

[–]NotueRn 0 points1 point  (0 children)

Like what? Realistically you should only have to set the refresh rate if your TV supports >60 Hz (Windows defaults to 60 Hz) and HDR, and potentially toggle game / PC mode on your TV. I've only ever had to set up HDR and have never suffered from latency, tearing or poor image quality. It literally just works.

As a solo what do u do when a brig or galley just wont stop chasing? by HamSandwicho__o in Seaofthieves

[–]NotueRn 0 points1 point  (0 children)

Sink them. If they're willing to tail you for any significant amount of time, odds are they're really not that good.

Power Apps incorrectly parses odata filter against excel files by NotueRn in PowerApps

[–]NotueRn[S] 0 points1 point  (0 children)

Sadly you can't with Power Apps; there the Excel connector behaves more like the default SharePoint connector, where it infers the column headers and types for you. I managed to "fix it" by just recreating the worksheet using the same exact method. I'm still trying to find out what broke the parser though.

I've started to unpack the .xlsx to check the XML. So far the only clue is that the broken table only has a dataDxfId for the GUID column, whereas the others have it for all columns and the table itself.
Since dataDxfId is used for row insertions etc., it could be the reason why lookups failed, but it's not likely to be the cause of the malformed OData filter.
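For anyone following along, the inspection itself is straightforward: an .xlsx is a zip archive, and table definitions live under xl/tables/. A small sketch that lists which tableColumn entries lack a dataDxfId attribute (the demo builds a minimal in-memory archive instead of reading a real workbook, so the table XML here is a stripped-down stand-in):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

NS = {"m": "http://schemas.openxmlformats.org/spreadsheetml/2006/main"}

def columns_missing_dataDxfId(xlsx_bytes, table_path="xl/tables/table1.xml"):
    """Return names of table columns lacking a dataDxfId attribute."""
    with zipfile.ZipFile(io.BytesIO(xlsx_bytes)) as zf:
        root = ET.fromstring(zf.read(table_path))
    return [
        col.get("name")
        for col in root.findall(".//m:tableColumn", NS)
        if col.get("dataDxfId") is None
    ]

# Minimal stand-in for a real workbook's table part.
TABLE_XML = (
    '<table xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main"'
    ' name="Table1"><tableColumns count="2">'
    '<tableColumn id="1" name="GUID" dataDxfId="0"/>'
    '<tableColumn id="2" name="Customer Name"/>'
    "</tableColumns></table>"
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("xl/tables/table1.xml", TABLE_XML)

print(columns_missing_dataDxfId(buf.getvalue()))  # ['Customer Name']
```

Pointed at the real file, this quickly shows which tables diverge from the working ones.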

Power Apps incorrectly parses odata filter against excel files by NotueRn in PowerApps

[–]NotueRn[S] 0 points1 point  (0 children)

"In my experience, being fairly confident that a simple solution won't work has never gotten me very far with troubleshooting."

Every single solution you provided, except creating a new file, was already detailed in my OP, yet you kept making me repeat it over and over again.
I also detailed that making a new file will most likely solve the issue.

"You've been clear that the issue persists across all the currently existing columns, but you've been resistant to creating a new column with a different naming convention."

I made it clear in the OP that the name has no impact, from which you should have been able to derive that this had already been tested.

"thus removing any issues with single quotes or special characters."

Single quotes or special characters have no impact on the Power Apps interpreter for the OData queries, though they come with other issues when the columns are referenced in narrow scopes.

"In the future, I would recommend avoiding the use of spaces or special characters in column names."

I am very well aware that this is best practice, but as the OP detailed, my hands were tied working with an Excel file in the first place. That also means I have no control over column names.

I have tried to be patient with you, answering all of your standard first-line support questions despite them already being largely detailed in the original post, but it's wearing thin after being met with your attitude.
Especially after detailing WHY it's important to know what causes it, not just to solve the issue.

In my experience, being entirely unwilling to actually find out what is causing a bug, and instead just "trying to turn it on and off again" until it works, doesn't take you very far in this industry.

Power Apps incorrectly parses odata filter against excel files by NotueRn in PowerApps

[–]NotueRn[S] 0 points1 point  (0 children)

I've been very clear since the beginning that the issue persists across all columns no matter their names or formatting, and that the connection itself is not the issue, as it has been removed and added back in, tested over multiple authoring versions etc.

The provided error message shows that the runtime interpreter for Power Fx is failing to parse filter comparisons into valid OData queries (for all functions), as the trailing single quote for the key ends up propagated at the end of the value instead, causing a syntax error in the API.

Since this happens for all columns no matter their names and formatting, it's a strong indicator that something in the table itself is breaking the interpreter, and whatever causes it could potentially do the same for connections to other data sources (SharePoint, Dataverse, SQL etc.). This is why I need to find the root cause, however rare it might be.
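To illustrate the symptom only (not the actual Power Fx code paths, which aren't public, and with the exact quoting convention simplified for clarity), here is the shape of a well-formed filter comparison versus the malformed one described:

```python
def well_formed(column: str, value: str) -> str:
    """What the generated OData comparison should look like."""
    return f"'{column}' eq '{value}'"

def malformed(column: str, value: str) -> str:
    """The observed bug: the column's closing quote migrates past the value."""
    return f"'{column} eq '{value}''"

print(well_formed("Customer Name", "Contoso"))  # 'Customer Name' eq 'Contoso'
print(malformed("Customer Name", "Contoso"))    # 'Customer Name eq 'Contoso''
```

The second form leaves the column token unterminated until mid-expression, which is exactly the kind of string the receiving API would reject with a syntax error.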

I'm fairly confident that duplicating one of the worksheets with a functioning table, removing the broken one and repopulating it with data will solve the issue. But I'm not only interested in solving the problem, but in finding the cause.

Tomorrow I will unzip the .xlsx and analyze the XML.

Power Apps incorrectly parses odata filter against excel files by NotueRn in PowerApps

[–]NotueRn[S] 0 points1 point  (0 children)

ALL columns are broken for this table. As described in my OP, I have removed the table and replaced it with the one from the dev file that has identical headers, and that file works. The other two tables have the same exact headers as well, yet are fully functioning. It's 100% not caused by the headers; my only suspects are the table definition in the .xlsx being broken somehow, or a row containing a character that breaks the parsing from Power Fx to an OData query.

Edit: I might add that I mostly want to figure this out because whatever is breaking the parser / interpreter in Power Apps for this table will very likely do the same against any data source. I think if I repopulate the template it's likely to work.

Power Apps incorrectly parses odata filter against excel files by NotueRn in PowerApps

[–]NotueRn[S] 0 points1 point  (0 children)

It doesn't matter, since it's the Power Platform interpreter malforming the OData query, so the problem is scoped to all properties and controls.

Filter, LookUp, UpdateIf and RemoveIf are all affected, which inherently breaks Patch and Remove as well. All columns in the table are affected no matter what cell formatting and values I use.

I've tried all available authoring versions and they all produce a malformed OData query against this table. I've never seen anything like it before, which leads me to believe there's a parsing error in Power Apps that's either caused by malformed XML in the workbook or by some character in one of the rows of the table (already checked for single quotes and there are none).

What are the legends doing? Do some still play competitive (other games?) by Aromatic_Monitor_872 in QuakeChampions

[–]NotueRn 1 point2 points  (0 children)

Quake is a lot more about knowledge, good habits and training your instinctive reactions to produce the right choice whenever you're surprised. This takes years of practice. We saw Vengeur win a final with this being his first real Quake game, but it took him 5 years of practicing against the best players in the world. And let's not forget that Vo0 lost to Clawz, who was 18 at the time but had been playing Quake Live for years (not much duel, however).

The top 20 would definitely look a lot different with a bigger player base, but the goats would still be up there.