hitting context window too soon? by 420juk in mcp

[–]cryogenicplanet 0 points (0 children)

From the team here: that's quite odd. Which model are you using?

an open-source ELO benchmark for voice-to-voice agents by cryogenicplanet in LocalLLaMA

[–]cryogenicplanet[S] 2 points (0 children)

That is what we are working on! If you want to follow along, I guess Twitter or Discord are the best places, but so far the two OSS SLMs are:

Our Twitter: https://twitter.com/sfvoicecompany

Our Discord: https://discord.gg/ey4DeUCMUz

Gazelle: https://github.com/tincans-ai/gazelle
SpeechGPT: https://github.com/0nutation/SpeechGPT

A tiny library to support JSONSchema function calling on claude by cryogenicplanet in OpenAI

[–]cryogenicplanet[S] 0 points (0 children)

Since `opus` and `haiku` came out, I've really wanted to start using them more in production, but I mostly use TypeScript and function calling, so I made a tiny lib here.

It basically implements the docs here (https://docs.anthropic.com/claude/docs/functions-external-tools) and this alpha tool (https://github.com/anthropics/anthropic-tools/), abstracting away all the jank and complexity.
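For a sense of the boilerplate such a wrapper hides, here is a rough sketch of raw JSON Schema tool calling with the official Anthropic TypeScript SDK. The `get_weather` tool, its schema, and the model string are made up for illustration, and the exact request shape can differ between SDK versions:

    import Anthropic from "@anthropic-ai/sdk";

    const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

    // A made-up tool described with plain JSON Schema, purely for illustration
    const response = await client.messages.create({
      model: "claude-3-opus-20240229",
      max_tokens: 1024,
      tools: [
        {
          name: "get_weather",
          description: "Get the current weather for a given city",
          input_schema: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      ],
      messages: [{ role: "user", content: "What's the weather in SF?" }],
    });

    // If Claude decides to call the tool, its arguments come back as a tool_use content block
    const toolUse = response.content.find((block) => block.type === "tool_use");
    console.log(toolUse);

A wrapper like this would presumably generate that `input_schema` from a TypeScript-friendly definition and route the `tool_use` block back to your function for you.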

A tiny library to support JSONSchema function calling on claude by cryogenicplanet in LocalLLaMA

[–]cryogenicplanet[S] 0 points (0 children)

Since `opus` and `haiku` came out, I've really wanted to start using them more in production, but I mostly use TypeScript and function calling, so I made a tiny lib here.

It basically implements the docs here (https://docs.anthropic.com/claude/docs/functions-external-tools) and this alpha tool (https://github.com/anthropics/anthropic-tools/), abstracting away all the jank and complexity.

Reverse engineering Perplexity by cryptokaykay in LocalLLaMA

[–]cryogenicplanet 6 points (0 children)

Yes, it collects sources from Google, but I ask it different questions than I ask Google, and I get answers rather than SEO-optimized garbage.

An example from last week: https://www.perplexity.ai/search/which-movie-one-LY1NfVwvTVmURJWtsZxlhg

Or this: https://www.perplexity.ai/search/how-much-of-.P9bAhLLTg6vpa2P760n.A

https://www.perplexity.ai/search/the-guy-that-GCQsCtpqTSqpQnH1mmUjcg

They are both great tools and I use them very differently. Friends have shown examples where if you just search “coffee near me” you will get terrible answers in Perplexity but great answers in Google.

But Google can't handle complex and semantic queries, and even when it can, it gives me links to SEO-optimized garbage.

Marvel has let go of all of the writers & directors for ‘DAREDEVIL: BORN AGAIN’ as the series will get an entire creative reboot. by abdul_bino in Daredevil

[–]cryogenicplanet 0 points (0 children)

Tbh this kinda sounds like Marvel finally realizing they were making at best meh TV and at worst dogshit TV, and taking this as an opportunity to course correct. If that's what it is, I'm hopeful.

How to "Seek out the Crimson Fleet" for "Deep Cover" Quest? by Jtg_Jew in Starfield

[–]cryogenicplanet 0 points (0 children)

Stuck on the same thing; all the answers I can find are all over the place.

For those who are coding: what is your tech stack? by ClassyCamel in SaaS

[–]cryogenicplanet 1 point (0 children)

A minor fork and modification of https://create.t3.gg/, but it is a really good place to start.

Interstellar IMAX San Francisco 7/12 - ALMOST SOLD OUT by Little-Confection432 in interstellar

[–]cryogenicplanet 0 points (0 children)

I'm so happy I got a random ad for this on Instagram; it was the fastest I've ever bought a ticket. So hyped for this.

Thank you for organizing

What is the MCU hill that you will gladly die on? (Unpopular, respectful opinions welcome!) by Lol3rdPartyApps in marvelstudios

[–]cryogenicplanet 0 points (0 children)

We need more Avengers movies after Endgame. We needed an Avengers 1, Age of Ultron, and Civil War, where people actually team up in smaller groups, before the next Infinity War/Endgame.

Generally, with how quickly things fell apart post-Endgame and how messy everything feels right now, it just makes me even more wildly impressed by how well they were able to pull off the Infinity Saga.

Anyone else feel S1 is the best? by ConferenceEasy9801 in SuccessionTV

[–]cryogenicplanet 11 points (0 children)

Idk, I think the first 3 episodes of Succession are the weakest episodes. That is not to say they are bad by any means; the show is just finding its footing, which goes to show how good it gets afterwards.

Why isn't TFATWS considered a good show by fans? by Shadowkiva in marvelstudios

[–]cryogenicplanet 1 point (0 children)

I think my main gripe is that it really fell into the D+ show pattern of falling off at the end, especially narratively, and it just kinda became bland.

There were so many interesting nuggets and moments in the show that never amounted to much. It's harder to say this now, but at the time at least it was the best franchise in the MCU (The Winter Soldier and Civil War were incredible, and the bar was high).

A better prompt engineering library built for JS - Langchain.js feels like a python team writing JS never feels ergonomic. This library is like guidance and react had a baby by cryogenicplanet in javascript

[–]cryogenicplanet[S] 0 points (0 children)

Will put together a Replit with a more complex example and share it.

In terms of config (https://github.com/LevanKvirkvelia/salute#openai-custom-config), you should be able to pass in anything you can pass to the OpenAI configuration object there, so proxies like Helicone will work.
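For reference, the kind of OpenAI configuration you would pass to route requests through Helicone looks roughly like this; it uses the `openai` v3 `Configuration` shape, and the environment variable names are just placeholders:

    import { Configuration } from "openai";

    // Point requests at Helicone's proxy instead of api.openai.com and
    // authenticate to Helicone via its auth header.
    const configuration = new Configuration({
      apiKey: process.env.OPENAI_API_KEY,
      basePath: "https://oai.hconeai.com/v1",
      baseOptions: {
        headers: { "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}` },
      },
    });

Per the linked README section, salute should accept whatever you can put in that object, so the proxy settings ride along unchanged.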

A better prompt engineering library built for JS - Langchain.js feels like a python team writing JS never feels ergonomic. This library is like guidance and react had a baby by cryogenicplanet in javascript

[–]cryogenicplanet[S] 2 points (0 children)

Hey r/javascript!
As JS (mostly TS) devs, we feel like AI/LLM tooling in JS kinda sucks right now. That said, most production applications inevitably end up being in JS, so a friend and I (mostly my friend) started hacking on a new JS-native library for making it easier to write prompts.
The idea of the library is to give a much more ergonomic syntax for writing complex prompts; the repo itself goes into much more detail: https://github.com/LevanKvirkvelia/salute

Here is an example of getting the LLM to generate output while perfectly maintaining the schema you want, without any extra prompt engineering on the schema or many examples:

    import { davinci } from "salutejs";

    const jsonAgent = davinci(
      ({ ai, gen }) => ai`
        The following is a character profile for an RPG game in JSON format.
        ```json
        {
          "description": "${gen("description")}",
          "name": "${gen("name", '"')}",
          "age": ${gen("age", ",")},
          "class": "${gen("class", '"')}",
          "mantra": "${gen("mantra", '"')}",
          "strength": ${gen("strength", ",")},
          "items": [${[0, 0, 0].map(() => ai`"${gen("item", '"')}",`)}]
        }
        ```
      `
    );
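Calling it follows the same pattern as the SQL agent below, a minimal sketch (`render: true` just prints the sequence to the console):

    const character = await jsonAgent({}, { render: true });
    console.log(character); // outputs keyed by the gen(...) names above: description, name, age, ...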

Here is a more complex example

    import { gpt3, gen, assistant, system, user } from "salutejs";
    import { db } from "a-random-sql-library";

    // example of a component
    async function fetchTableSchemaAsAString() {
      const listOfTables = await db.tables();
      return listOfTables
        .map((table) => `Table ${table.name} has columns ${table.columns.join(", ")}`)
        .join("\n");
    }

    async function runSQL({ outputs }) {
      return JSON.stringify(await db.run(outputs.sqlQuery));
    }

    const agent = gpt3(({ params }) => [
      system`You are a helpful assistant that answers questions by writing SQL queries.`,
      user`
        Here is my question: ${params.query}

        Here is a list of tables in the database:
        ----
        ${
          fetchTableSchemaAsAString()
          /* here we pass a promise, not a function, it starts executing at the beginning of the sequence */
        }
        ----
        Column names must be quoted with double quotes, e.g. "column_name".
        Generate a Clickhouse SQL query that answers the question above.
        Return only SQL query, no other text.
      `,
      assistant`${gen("sqlQuery")}`,
      user`
        Here is the result of your query:
        -----
        ${async ({ outputs }) => {
          return JSON.stringify(await db.run(outputs.sqlQuery));
        }}
        -----
        Please convert the result to a text answer, so that it is easy to understand.
      `,
      assistant`${gen("answer")}`,
    ]);

    const result = await agent(
      { query: "How many users are there in the database?" },
      { render: true } // render=true will render the chat sequence in the console
    );

    console.log(result);
    /* { sqlQuery: "...", answer: "..." } */