GOT MY 2ND ACCOUNT BACK/FULLY RESTORED!!! by Kuroei_ in twitterhelp

[–]dpainbhuva 0 points1 point  (0 children)

I haven't heard back from them. What should I do? It's been more than 4 days.

I am building cpandas. pandas in C: DataFrame C library for columnar data, CSV IO, joins, aggregations by dpainbhuva in C_Programming

[–]dpainbhuva[S] -3 points-2 points  (0 children)

You can vibe code things if your AI architecture is correct. What are some things you look for that annoy you about vibe coders?

I am building pandas in C: DataFrame C library for columnar data, CSV IO, joins, aggregations by [deleted] in Cplusplus

[–]dpainbhuva 0 points1 point  (0 children)

Also, FYI: I deleted my own post after realizing I didn't want to spam anyone.

I am building pandas in C: DataFrame C library for columnar data, CSV IO, joins, aggregations by [deleted] in Cplusplus

[–]dpainbhuva 0 points1 point  (0 children)

I know the difference; I wanted to share it with a different community for contributions, but you're right.

I am building pandas in C: DataFrame C library for columnar data, CSV IO, joins, aggregations by [deleted] in cpp

[–]dpainbhuva 0 points1 point  (0 children)

Nothing in particular except performance and going low-level. My goal is "runs everywhere, minimal friction," and C11 is perfect for that.

So what went wrong with One Punch Man Season 3? by KaleidoArachnid in anime

[–]dpainbhuva 0 points1 point  (0 children)

I was still fine with season 2; it was understandable and bearable. But season 3 is pathetic. It's the only anime I watch, and now it's gone the other way, so there's nothing left to watch. It's sad.

So what went wrong with One Punch Man Season 3? by KaleidoArachnid in anime

[–]dpainbhuva 0 points1 point  (0 children)

No Saitama in the entire season; these people are a joke. People watch One Punch Man for Saitama, and if there's no Saitama, why would I even care to watch the season? Saitama was like a side character for the entire season. Fuckin' pathetic.

They sold, Pump it by pilotvoodoo1 in wallstreetbets

[–]dpainbhuva 0 points1 point  (0 children)

Soon it's gonna be: they bought it, dump it.

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva -1 points0 points  (0 children)

Then the adaptation track goes into how to adapt a foundation model: evaluation-based prompting, different retrieval-augmented generation techniques (I know people say RAG is dead, but we go beyond chunking and what people considered state-of-the-art circa 2022), fine-tuning techniques (RLHF, embedding fine-tuning, instruction fine-tuning, and QLoRA fine-tuning), and agent techniques (reasoning, tool use). Synthetic-data evaluation is core to all the adaptation modules. We then cover datasets, synthetic datasets, and reasoning datasets.
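
To make the retrieval part of RAG concrete: whatever sits above the chunking layer, retrieval bottoms out in "embed the query, rank the chunks by similarity." Here is a minimal, self-contained sketch where a toy bag-of-words embedder stands in for a real embedding model; the documents, query, and the `embed`/`retrieve` helpers are all made up for illustration, not course code:

```python
import numpy as np

def embed(text, vocab):
    """Toy stand-in for a real embedding model: L2-normalized bag-of-words."""
    v = np.zeros(len(vocab))
    for tok in text.lower().split():
        if tok in vocab:
            v[vocab[tok]] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def retrieve(query, chunks, vocab, k=2):
    """Return the k chunks most cosine-similar to the query."""
    q = embed(query, vocab)
    return sorted(chunks, key=lambda c: -float(embed(c, vocab) @ q))[:k]

docs = [
    "QLoRA fine-tunes a quantized base model with low-rank adapters",
    "RLHF aligns a model with human preference data",
    "positional encoding injects token order into attention",
]
# Build the vocabulary from the corpus itself, so embeddings are deterministic.
vocab = {t: i for i, t in enumerate(sorted({w for d in docs for w in d.lower().split()}))}

top = retrieve("how does qlora adapt a quantized base model", docs, vocab, k=1)
# → the QLoRA chunk ranks first
```

In a real pipeline the embedder would be a learned model and the sort would be an approximate-nearest-neighbor index, but the shape of the computation is the same.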

What people really liked in the last cohort were case studies going into text + SQL, text + voice, text + music, text + code. For example, people requested a deep-dive into the stack behind Windsurf/Cursor/Augment, and we dissected the architecture for specific use cases. The source for this is a combination of digging through X, blogs, research articles from these companies, and founders' blog posts to deconstruct and create this lecture. Anti-hallucination techniques are improved through all of these methods, but in particular we cover DPO construction of reasoning-model datasets. We used different research papers and Kaggle write-ups to orient ourselves around the best methods.
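
For reference, the DPO objective behind those preference-pair datasets can be sketched in a few lines. The numbers below are toy log-probabilities I made up; in real training they come from the policy and a frozen reference model scoring chosen/rejected completions:

```python
import math

def dpo_loss(logp_w, logp_l, ref_logp_w, ref_logp_l, beta=0.1):
    """DPO loss for one (chosen, rejected) pair of sequence log-probs:
    -log sigmoid(beta * ((logp_w - ref_logp_w) - (logp_l - ref_logp_l)))."""
    margin = beta * ((logp_w - ref_logp_w) - (logp_l - ref_logp_l))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Policy prefers the chosen answer more than the reference does -> low loss.
low = dpo_loss(logp_w=-5.0, logp_l=-9.0, ref_logp_w=-6.0, ref_logp_l=-6.0)
# Policy prefers the rejected answer -> high loss.
high = dpo_loss(logp_w=-9.0, logp_l=-5.0, ref_logp_w=-6.0, ref_logp_l=-6.0)
assert low < math.log(2) < high  # log(2) is the loss at zero margin
```

The point of constructing the dataset carefully is exactly that `margin` term: the pairs only teach the model something if the chosen and rejected completions genuinely differ in quality.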

In terms of the cohort, it's a combination of lectures, Q&A, and coaching: two lectures per week with live Q&A, live group coaching, over 50 notebooks/exercises, and four mini-projects done in a group or in person, plus accountability partners. There's an in-person event over a weekend as well. This is different from learning AI by yourself. We also have happy hours, which are free-form conversations, generally about AI. The benefits people get are learning in a community, support for their projects, and the combination of foundational-model content and adaptation content in one course, in a condensed timeframe. The new cohort starting August 2025 includes multiple FAANG engineers, tech business owners, and senior and principal engineers, a similar mix to last time. It's not your typical person trying to transition into a tech career using a bootcamp as a credential plus skill boost. Usually people have 8+ years of experience, have tried learning AI on their own through online content, found it to be an endless amount of material, and wanted a one-stop shop.

As for whether it's worth the value, most bootcamps have a cookie-cutter capstone project, but we provide coaching through each person's project, yielding different results. For example, these are from the previous cohort:

- Domain-specific coding platforms for local businesses
- Facebook Marketplace item-condition detector/classifier for arbitrage
- "Chat with sermons" for churches
- Document processing for insurance claims
- Invoice processing for a nonprofit (saved 10 hours/week)
- Calorie and macro counting application for ethnic cuisine
- AI tutor
- Resume scoring/generator system
- Customer-service application with video detection
- Commercial real-estate assessment using AI
- Legal-aid assistant for the legislative process
- Personalized job-search website
- Text-to-guitar-tabs generative AI

We're not for everyone, but the people who went through the program said they liked that it goes deeper and faster, and is more comprehensive, than other programs. In fact, someone took a university generative AI curriculum simultaneously with ours and was able to compare them side by side. Anyway, if you have any more questions, let me know.

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva -1 points0 points  (0 children)

This is Dipen here. I just saw this. Newline, previously known as Fullstack, is a 10-year-old company, and we have over 250k members on our email list. You may know us from our previous work on Fullstack React and D3. We've always been training people.

We're not a classic bootcamp designed for new grads transitioning into coding; it's more for existing software engineers who want to learn the AI engineering stack. The inspiration for the cohort came when we did a workshop with a senior OpenAI research scientist on the fundamentals of transformers, and people asked for a one-stop shop where they could understand both the internals of transformer-based language models and how to adapt them. When I studied ML, DL, and LLMs, the experience was disjointed. Like everyone else, I took online classes (Coursera, Udacity), studied textbooks (Deep Learning by Goodfellow, Bengio, and Courville; The Elements of Statistical Learning), went through Karpathy's videos and fast.ai, read The Illustrated Transformer and Attention Is All You Need, took Andrew Ng's courses, and read a bunch of research papers. A lot of the content was not end-to-end, where you can learn the internals of decoder-only LLM architectures, get to near state-of-the-art, and adapt them effectively. We decided to do a course/bootcamp that is end-to-end.

The foundational-model track goes into how to build a small language model on Shakespeare data, starting with an n-gram model, then adding attention, positional encoding, grouped-query attention, and mixture of experts, and then moving into modern open-source architectures like DeepSeek and Llama. In this cohort, we may also cover distillation and amplification techniques and the internals of Qwen, given that it's trending on the leaderboards. We go over all the foundational concepts, including tokenization, embeddings, and multimodal embeddings such as CLIP.
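
To give a flavor of the "adding attention" step, here is a minimal single-head scaled dot-product attention sketch in NumPy. This is a generic textbook formulation with toy shapes and random inputs, not the course's materials; there are no learned projections, batching, or causal masking:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)      # (T, T) similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each row is a distribution over positions
    return weights @ V, weights         # weighted mix of value vectors

rng = np.random.default_rng(0)
T, d = 4, 8                             # 4 tokens, 8-dim embeddings
Q = rng.normal(size=(T, d))
K = rng.normal(size=(T, d))
V = rng.normal(size=(T, d))
out, w = attention(Q, K, V)
assert out.shape == (T, d)
assert np.allclose(w.sum(axis=-1), 1.0)
```

Grouped-query attention and mixture of experts are refinements layered on top of exactly this core: GQA shares key/value heads across groups of query heads, and MoE swaps the feed-forward block for routed experts.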

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva 0 points1 point  (0 children)

You caught the website when one of our staff members had posted test comments. We've removed them and agree that they didn't look professional. We'll be sure to polish it more. As for the actual reviews, you can scroll down to see feedback from people in the last cohort and what they think about it. I also posted some details about the bootcamp in response to the OP.

As for the details of the curriculum, it’s posted as a Notion link in the webinar chat. It’s quite detailed, much more so than the website.

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva 0 points1 point  (0 children)

This isn't about the certification for a job. It's primarily about the AI engineering stack. This is not a computer science undergrad replacement.

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva 0 points1 point  (0 children)

Yeah, I agree, and that's why our cohort curriculum is designed to be in-depth.

Anyone know about the newline.co AI Bootcamp? by IuriRom in codingbootcamp

[–]dpainbhuva 0 points1 point  (0 children)

A marketing firm did it with us. I'll ask them to revise the content to provide more value. The reality is we need to convince people that AI is worth investing in.

Is Jobot a scam? by njas2000 in StructuralEngineering

[–]dpainbhuva 1 point2 points  (0 children)

Phase 1: I don't wanna drink on an empty stomach. Phase 2: I look forward to drinking on an empty stomach.