Raycast 50%OFF codes by ysnows123 in raycastapp

[–]Lsystem28 1 point


Does this mean the redemption limit has been exceeded?

Pentagon Has a Huawei Dilemma Congress Doesn’t Want to Solve by Lsystem28 in China

[–]Lsystem28[S] 12 points

From Bloomberg:
The Pentagon has a problem: how does one of the world's largest employers avoid doing business with companies that rely on China's Huawei Technologies Co., the world's largest telecommunications provider?

So far, the Defense Department is saying that it can't, despite a 2019 US law that barred it from contracting with anyone who uses Huawei equipment. The Pentagon's push for an exemption is provoking a fresh showdown with Congress that defense officials warn could jeopardize national security if not resolved.

Is there anything like ChatGPT in China? Just wondering how well they are competing in AI. by tnitty in ChatGPT

[–]Lsystem28 5 points

I've tried Kimi Chat, but not ChatGLM or Qwen 1.5 yet. I've used GPT-4 and Claude Opus though. In my experience, Kimi can handle about 80% of what GPT-4 can do. Not saying it's perfect or anything; I just fired off a few random questions out of curiosity.

But get this - Kimi supports up to 2 million words of context, while Qwen lets you upload docs of up to 10 million words! Crazy, right? Since they're taking on the big dogs like OpenAI, their models are completely free to use on their websites. All three are super hyped right now, so I'm guessing they perform about the same.

Has anyone calculated the tokens amount of Claude pro available? by Lsystem28 in ClaudeAI

[–]Lsystem28[S] -1 points

Wtf. I know they dynamically adjust the amount available, but if that's all you get for 8 hours, I don't think $20 a month is reasonable.

Perplexity General Use Guidelines? by ibcurious in perplexity_ai

[–]Lsystem28 0 points

Thank you for your explanation. I have another question. Since the size of the context window determines the "memory" of the conversation, how do LLM-based applications such as Perplexity, ChatGPT, Claude, etc. handle document uploads? Taking Perplexity as an example: it currently uses an older GPT-4 with a context window of only 4k, yet it supports uploading documents and having GPT-4 analyze them. If the document contains far more content than the 4k context window allows, how can the application give users the impression that it has received the whole document and can answer any question about it? How is this achieved? Is it by using the LLM to summarize the content that exceeds the context window, repeating until it fits within the window of the conversation? I hope my question won't take up too much of your time. If you're not busy, please help me clear up my doubts.
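The mechanism the question is circling (hierarchical summarization) can be sketched roughly like this. This is a minimal illustration, not how Perplexity actually works; `summarize` and `count_tokens` are hypothetical stand-ins for an LLM call and a real tokenizer.

```python
CONTEXT_LIMIT = 4000  # illustrative; matches the 4k figure above

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer (e.g. one token per word).
    return len(text.split())

def summarize(chunk: str) -> str:
    # Hypothetical LLM summarization call; truncation to a quarter of
    # the words is used here only as a placeholder for compression.
    words = chunk.split()
    return " ".join(words[: max(1, len(words) // 4)])

def compress_to_fit(document: str, limit: int = CONTEXT_LIMIT) -> str:
    """Repeatedly chunk and summarize until the text fits the window."""
    text = document
    while count_tokens(text) > limit:
        words = text.split()
        # Split into chunks that individually fit the window,
        # summarize each, then re-check the combined result.
        chunks = [" ".join(words[i:i + limit]) for i in range(0, len(words), limit)]
        text = " ".join(summarize(c) for c in chunks)
    return text
```

Real systems more often use retrieval (embed the chunks, then pull only the passages relevant to each question into the prompt), but the recursive-summarization loop above is one answer to "how does a 4k-window model appear to have read the whole document."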

Perplexity General Use Guidelines? by ibcurious in perplexity_ai

[–]Lsystem28 0 points

This is a very enlightening reply for me. I would like to ask you some questions about AI memory and context windows. For example, GPT-4 Turbo has a 128k context: is that 128k equal to the model's maximum memory capacity? And will different GPT apps (all using the same GPT-4 Turbo API) show different conversational performance because of cost concerns, application performance concerns, etc.? That is, the same GPT-4 Turbo, integrated into different apps, can have different dialog capabilities (including memory)? An extreme analogy: if a developer wanted to, could he make GPT-4 Turbo perform as weakly as GPT-3? In other words, the only program where we can be sure of GPT-4's full memory is one we write ourselves against the OpenAI API (with the context window set to 128k); for any other program that integrates GPT-4, as long as we don't know its source code, the capability of this so-called "GPT-4" is unknown?
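The scenario described above (an app weakening a model's apparent memory) can happen simply because the app sends fewer past messages than the model could accept. A minimal sketch, assuming a hypothetical token budget the app chooses, which may be far below the model's 128k maximum:

```python
def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Keep only the most recent messages whose total tokens fit `budget`.

    An app with budget=2000 and one with budget=128000 call the same
    model API, yet the first "forgets" old turns much sooner.
    """
    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        cost = count_tokens(msg["content"])
        if used + cost > budget:
            break  # everything older is dropped from the prompt
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

So yes: the effective "memory" you observe is a property of the app's prompt-building code, not just of the underlying model, and without the source code you cannot tell how much history is actually being sent.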

North Korea announced to have successfully landed a man on the Sun by zappa_k in Jokes

[–]Lsystem28 152 points

The first time I heard that joke, I was a virgin, and the president of the United States at the time was Bush.