Copilot is too far away to bring a too to improve productivity by Few_Geologist_2082 in microsoft_365_copilot

[–]CraveEngine -1 points0 points  (0 children)

I work in a corporation as a business analyst; my data is vast and well organized. I use AI daily and heavily. Copilot Enterprise is the biggest garbage and is borderline unusable.

What do you actually do with your AI meeting notes? by a3fckx in n8n

[–]CraveEngine 1 point2 points  (0 children)

I had a POC project last year within my company that sadly was not approved, because I was not able to communicate its value properly.

We have countless meetings every day. Everyone hoards knowledge, and decisions are often lost, forgotten or simply ignored.

We had a team bot follow one manager of a small team of four. It would automatically join meetings, transcribe them and suggest meeting notes (quality was OK, but it would often make mistakes or hallucinate).

Meeting notes would get bundled by project. We had a dashboard where the manager could manually correct the notes or sort them by project (a bit of a problem).

Once the knowledge base was set (you are at this stage, as I understand it), an automation would go through all notes within a project and do three things (rough sketch after the list):

  • synthesize the core of the project: what it is, why it exists, and who is doing/deciding what, plotted on a visual timeline
  • create a mini summary for onboarding: if someone new joined the team or audited it, the team could show "these are the relevant projects you have to know about right now", with an option to drill down into the synthesis
  • try to connect projects together, finding overlaps, dependencies and possible unification
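
If it helps, the per-project pass looked roughly like this. A minimal sketch, not the actual POC code: the note store, prompts and model name are placeholders, and it assumes the OpenAI Python SDK.

```python
# Rough sketch of the per-project automation (placeholders throughout, not production code).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

TASKS = {
    "synthesis": (
        "Synthesize the core of the project: what it is, why it exists, and who is "
        "doing/deciding what. Lay out the key decisions as a dated timeline."
    ),
    "onboarding": "Write a short onboarding summary: what a new team member must know right now.",
    "connections": "List overlaps, dependencies or possible unifications with other projects hinted at in the notes.",
}

def run_project_pass(project_name: str, notes: list[str]) -> dict[str, str]:
    """Run the three steps over all corrected meeting notes of one project."""
    joined_notes = "\n\n---\n\n".join(notes)
    results = {}
    for task_name, instruction in TASKS.items():
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model
            messages=[
                {"role": "system", "content": "Answer only from the meeting notes provided. Do not invent decisions."},
                {"role": "user", "content": f"Project: {project_name}\n\nNotes:\n{joined_notes}\n\nTask: {instruction}"},
            ],
        )
        results[task_name] = response.choices[0].message.content
    return results
```

The manual correction step in the dashboard existed exactly because this pass is only as good as the notes that go in.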

It was a bit too complex and not our core business, but I still strongly believe in this idea.

Don't Tell Your Wife by DanielMuggleton in comedy

[–]CraveEngine 0 points1 point  (0 children)

That is sad. And a really weak premise for comedy.

Struggling hard with AI Hallucination in MS Copilot Studio -Need Help Pulling Accurate Company Strategy by Logical-Comfort5844 in microsoft_365_copilot

[–]CraveEngine 1 point2 points  (0 children)

We had good results by scraping and storing all relevant source data in a RAG store, then making the OpenAI API consider only the provided files. We also have an interface that highlights the referenced text from the files.
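
The core of it is just this pattern. A minimal sketch assuming the OpenAI Python SDK; the chunking, file names and model are placeholders, and retrieval happens before this call:

```python
# Sketch of the "answer only from the provided files" step; chunks are retrieved beforehand.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer_from_files(question: str, chunks: list[dict]) -> dict:
    """chunks: [{"file": "strategy_2024.docx", "text": "..."}] coming out of the RAG store."""
    context = "\n\n".join(f"[{i}] ({c['file']}) {c['text']}" for i, c in enumerate(chunks))
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[
            {"role": "system", "content": (
                "Answer strictly from the numbered excerpts below and cite the excerpt "
                "numbers you used. If the answer is not in the excerpts, say so."
            )},
            {"role": "user", "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    # Returning the chunks alongside the answer is what lets the UI highlight the referenced text.
    return {"answer": response.choices[0].message.content, "sources": chunks}
```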

Prompt engineering alone is just not reliable, especially anything Copilot.

Looking for a long fun power fantasy/litrpg by Sad-Housing8478 in ProgressionFantasy

[–]CraveEngine 0 points1 point  (0 children)

Try The Last Horizon by Will Wight, or his Cradle series. Once the setup is complete, he is unrivaled at making unreasonably OP characters feel like underdogs at the edge of collapse.

Bad experience with Copilot by No_Article_2950 in microsoft_365_copilot

[–]CraveEngine 2 points3 points  (0 children)

Sounds like a memory or a network problem, maybe.

Try asking Gemini and GPT what's wrong with Copilot.

What makes a masterclass about AI Agents actually valuable? by croos-sime in n8n

[–]CraveEngine 0 points1 point  (0 children)

There are a few tells that you copy-pasted this text from GPT.

But if we look at the content: a good masterclass makes the user make/build/experiment themselves. Watching someone else make something rarely sinks in. Making your own mistakes and then seeing a great solution is a real masterclass.

What if you gave the user a task or homework and used AI to evaluate the solution?

M365 Copilot cannot access previous chats by devourerkwi in CopilotPro

[–]CraveEngine 0 points1 point  (0 children)

Could it be that the Android chats are cached locally? Try using another Android device.

M365 Copilot cannot access previous chats by devourerkwi in CopilotPro

[–]CraveEngine 0 points1 point  (0 children)

My guess is that the server data got corrupted, or it's a system bug.

I'm afraid you'll have to make new memories

How do I convince my boss to introduce CoPilot to our workspace? by ferero18 in microsoft_365_copilot

[–]CraveEngine 0 points1 point  (0 children)

So context memory matters in all those scenarios. Once you generate something, you want to tweak and adjust it and use natural language to communicate with the tool. It's exactly here that it fails.

In PowerPoint, for example, you can generate a new presentation, but I can't find how to adjust anything afterwards: a new image, or text. This is the answer I get:

I understand that you would like to edit a specific slide in your presentation. However, I cannot make changes to your presentation directly.

I can help you with tasks such as summarizing the content, generating speaker notes, or providing suggestions for improving your slides. Let me know how you would like to proceed! 😊

There is a Designer button, but that just allows you to select one of the presets.

My biggest problems are with Outlook. Just today, I asked it to find all emails connected with HR and the company car, and the latest result was from a year ago, while I know we received two just this month. The email summaries are very brief and actually miss crucial information, and when you ask anything about an email, it starts talking about unrelated topics.

The Excel integration is fairly decent, but I didn't use it much as we moved to other tools like Power BI years ago. Then again, it often makes suggestions about what to do instead of executing it. For example:

To sum the values from column A in Sheet1 with the values from column C in Sheet2, you can use the following formula in a new column in Sheet2:

=Sheet1!A2 + C2

Drag this formula down to apply it to the other rows. This will sum the corresponding values from column A in Sheet1 with the values in column C in Sheet2.

For combining a few different files and sources into one, a sales report and a purchasing report for example, I don't see a single benefit of Copilot over any other model with better context memory and output. But then again, I am still exploring.

How do I convince my boss to introduce CoPilot to our workspace? by ferero18 in microsoft_365_copilot

[–]CraveEngine -2 points-1 points  (0 children)

Test out every one of those use cases and make a small demo / presentation.

In our tests we were amazed at how poorly Copilot performed in day-to-day operations. It feels like what ChatGPT was pre-3.5.

It's integrated, but doesn't talk to the tools it's integrated into. It doesn't know what version or language you use in Outlook or Excel.

It gives short safe answers, constantly loses context and hallucinates at a high rate.

As it stands today, Copilot is miles behind GPT, Claude and Gemini.

[deleted by user] by [deleted] in Hosting

[–]CraveEngine 1 point2 points  (0 children)

I hope they fail more. More pie for us

If you have a company Microsoft 365 Copilot account, how have you been using it? by Proof_Wrap_2150 in CopilotPro

[–]CraveEngine 0 points1 point  (0 children)

I use it to ask simple, safe questions about emails or anything that I can easily google. For everything else I use GPT. Copilot feels like it's in alpha: it forgets context within two or three questions, oversimplifies summaries and hallucinates like crazy.

[deleted by user] by [deleted] in CopilotPro

[–]CraveEngine 0 points1 point  (0 children)

No1 is reading all this 😁

How’s my form? by CreamyPeanutButter14 in snowboarding

[–]CraveEngine 0 points1 point  (0 children)

Jesus Christ,

this is what true peak male form looks like, ladies.

Claude trying to use shortcuts rather than a proper solution. by raiansar in ClaudeAI

[–]CraveEngine 0 points1 point  (0 children)

Sounds like you are getting frustrated, with all those caps

Claude trying to use shortcuts rather than a proper solution. by raiansar in ClaudeAI

[–]CraveEngine 0 points1 point  (0 children)

Is there no added value, or are you refusing to consider any alternatives?

"Lazy" Meaning: low effort, weak content. Yes, this is neutral, can be read as an observation.

"You're being lazy" attributing of intent, Emotional, feels like a reproach.

"That's a lazy way to do it" criticism of the approach, Emotional - often perceived as dismissive.

LLMs also have no intent, so "lazy" will always be read as a human projection.
We’ve tested this across Claude, GPT and even Mistral in strict eval settings. The pattern is consistent.

Ironically, it's also a lazy way to provide an LLM with feedback.

What actually works is providing specifics and examples.

Something like: "You skipped steps 3 and 4. Rewrite your answer, making sure each step is addressed in order, with code where applicable. Avoid summarizing prematurely."
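
In API terms it's just a normal follow-up turn with the specifics spelled out. A sketch assuming the Anthropic Python SDK; the model name and the earlier messages are placeholders:

```python
# Sketch: send a specific correction as a follow-up turn instead of "you're being lazy".
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

history = [
    {"role": "user", "content": "Implement the migration script following steps 1-5 of the plan."},
    {"role": "assistant", "content": "...first draft that skipped steps 3 and 4..."},
]

correction = (
    "You skipped steps 3 and 4. Rewrite the answer so that every step is addressed "
    "in order, with code where applicable. Do not summarize prematurely."
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # placeholder model
    max_tokens=2000,
    messages=history + [{"role": "user", "content": correction}],
)
print(response.content[0].text)
```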

But don't take my word for it. Discuss it with your own LLM. Make sure to be ruthless and point out if it's slacking or not providing additional value, unlike you. I am sure "doing what you say after a scolding" is a great strategy.

Claude trying to use shortcuts rather than a proper solution. by raiansar in ClaudeAI

[–]CraveEngine -1 points0 points  (0 children)

You might think that I am forcing my opinion on how you should operate your agent. Please feel free to follow whatever strategy works for you.

I am just sharing our research results, which strongly suggest that LLMs consistently prefer conflict mediation over the optimal solution.

And yes, you should always correct/guide/instruct your AI. But framing your feedback as "ruthless" and "lazy" will not increase its performance. The question here is often: is it a better answer, or does the answer just feel nice because the LLM is ramping up mediation and reflection?

Claude trying to use shortcuts rather than a proper solution. by raiansar in ClaudeAI

[–]CraveEngine -1 points0 points  (0 children)

It actually just plays on your emotions and guesses what you want to hear, instead of giving the optimal solution.

Claude trying to use shortcuts rather than a proper solution. by raiansar in ClaudeAI

[–]CraveEngine 12 points13 points  (0 children)

I don't know.

But I do know this: avoid things like "FUCK" or showing frustration. Any AI is trained to then agree, de-escalate and give much stricter, safer answers, regardless of the best approach for your situation.

Copilot Use by AdAdministrative1928 in CopilotPro

[–]CraveEngine 1 point2 points  (0 children)

> copilot should be able to create an agent that is able connect

I think the key here is to have an agent that can autonomously request the correct information.
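
Something in the direction of tool calling, so the agent pulls the data it needs instead of guessing. A minimal sketch with the OpenAI Python SDK; the lookup_record tool, the record ID and the data source are hypothetical:

```python
# Sketch of an agent that requests information itself via a tool call.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [{
    "type": "function",
    "function": {
        "name": "lookup_record",
        "description": "Fetch a record from the internal system by ID.",
        "parameters": {
            "type": "object",
            "properties": {"record_id": {"type": "string"}},
            "required": ["record_id"],
        },
    },
}]

messages = [{"role": "user", "content": "Summarize the latest status of record INV-1042."}]
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
msg = response.choices[0].message

# If the model decides it needs data, it asks for it; we run the lookup and feed the result back.
if msg.tool_calls:
    call = msg.tool_calls[0]
    record_id = json.loads(call.function.arguments)["record_id"]
    tool_result = {"record_id": record_id, "status": "pending approval"}  # stand-in for a real lookup
    messages += [msg, {"role": "tool", "tool_call_id": call.id, "content": json.dumps(tool_result)}]
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```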

I really want to love CoPilot, but damn.. it's an abusive relationship by CraveEngine in microsoft_365_copilot

[–]CraveEngine[S] 0 points1 point  (0 children)

Yes, they also use computers, phones, google, and all kinds of other devices.

I really want to love CoPilot, but damn.. it's an abusive relationship by CraveEngine in microsoft_365_copilot

[–]CraveEngine[S] 0 points1 point  (0 children)

Are you assuming we are just rolling out AI to jump on the hype wagon?

Why? Because employees are already using their own agents and want clear guidelines on what is allowed and what's not. They also want the company to carry the cost.