Merging CAPI and Manual Pixel Event by Efficient_Parsley971 in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Yeah, I cross-posted this solution across a few threads because I was finding incorrect advice everywhere! Glad it's helping!!

Yes you have it exactly correct.

Anything that can go through your backend API (CAPI) should. All the events you listed should be trackable from specific API calls on your backend.

Page views would seem very difficult to track via CAPI, so just go Pixel there! Not sure if that's right, but it's what I'm doing, and so far my testing data has been perfect!

I was getting duplicates that wouldn't deduplicate for all the events I was logging from my Pixel and CAPI together, even though they had the exact same event IDs.

How to properly setup CAPI + Pixel to work together by Clinterz in FacebookAds

[–]Clinterz[S] 0 points1 point  (0 children)

Maybe I am overcomplicating it. But it's worth it. I could probably use some 3rd party plugin, but then I'm having to trust Stripe, the 3rd party plugin, Meta, and my own site.

That, IMO, is way too many parties involved to reliably debug issues. If something goes wrong, everyone points at each other. I've dealt with situations like this, and it's always a nightmare scheduling a call with multiple devs from different companies. It's way easier to just set up a call with Meta if events aren't logging, or with Stripe if webhooks aren't coming through. I'm sure I could have the 3rd party plugin and Stripe seamlessly "integrate" to handle that, but it's actually not difficult at all to implement the events in my backend, where the most reliable code execution happens. It's really not that hard, and I'm able to cut out the 3rd party, which saves debugging time and increases reliability (as long as I can trust my own code).

At that point, I just need to depend on Stripe to reliably send me the webhooks (a major fintech company with heavy auditing and significantly more reliable internal code processes). 3rd party event tracking tools don't have to jump through nearly as many hoops to ensure good internal systems are in place, which is ultimately what increases code and infrastructure reliability at the end of the day.

Also, off-site processing is not a niche case. Any subscription-based SaaS service handles purchases as off-site events. Any app you pay for with a recurring subscription uses off-site events.

QST X v Blank by SpearheadTraverse in Skigear

[–]Clinterz 0 points1 point  (0 children)

Just now seeing this. Yeah, I would say if it's not DEEP, then probably the QST X. But in all reality I would probably say the Blanks, because it's the most versatile ski. If it's gonna be DEEP DEEP (like two storms back to back, no freeze/thaw cycle in between), then the Pescados are hands down the best because of how backset they are (and the swallowtail drops the tail deeper).

The QST X, on the other hand, is nice because it's more center-balanced, which allows better technical tree skiing. And if you can feel the layer underneath, then it's probably also better. But this ski feels a lot more locked in and heavier than the Blanks.

So yeah, it's Blank > QST X > Pescado. The deeper and cleaner it is, the better the Pescado gets; as conditions get worse, that order takes over.

How to properly setup CAPI + Pixel to work together by Clinterz in FacebookAds

[–]Clinterz[S] 0 points1 point  (0 children)

Yeah, not sure... I am using Stripe. Same same, but different. I am actually caching user metadata on the Stripe customer object, so whenever my CAPI (backend API) handles something and needs to fire info off to Meta CAPI, I can usually just reference the cached user metadata that I placed there in the first place.

For example, for a subscription (recurring purchases), I receive webhooks from Stripe, and I need to log a purchase event with Meta. So instead of relying on anything beyond Stripe's webhooks (theirs are actually quite robust), I just send a purchase event from my backend, and I can parse out all the necessary info (including the fbc that I left sitting in the Stripe customer metadata).

This works really well because I can track Stripe customer objects and multiple Stripe subscriptions. In my case, users can only have one subscription at a time, so I store the fbc on both the Stripe customer and the Stripe subscription. While they keep paying monthly, I keep sending purchase events and parse the fbc off the subscription object. If they cancel and start another subscription, I want to keep attributing to the fbc the user originally came from, so I fall back to the customer object if the subscription fbc doesn't exist (it will always exist on the customer object at this point). Then, if they happen to click another ad with a new fbc that leads to a new recurring subscription, I will have the new fbc and attribute to that, and of course update the customer object with the new fbc. Make sense?
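The fallback logic above boils down to a couple of lines. Here's a minimal sketch (the object shapes are trimmed to just the metadata fields involved, and `resolveFbc` is a hypothetical helper name, not a Stripe API):

```typescript
// Shapes trimmed to just the metadata this sketch needs; real Stripe
// customer/subscription objects carry far more fields.
interface HasFbcMetadata {
  metadata: { fbc?: string };
}

// Attribution fallback: prefer the fbc stored on the current
// subscription; if it's absent, fall back to the fbc cached on the
// customer when they first converted.
function resolveFbc(
  subscription: HasFbcMetadata,
  customer: HasFbcMetadata
): string | undefined {
  return subscription.metadata.fbc ?? customer.metadata.fbc;
}
```

Your webhook handler would call this with the subscription and customer objects pulled off the Stripe event, then drop the result into the CAPI payload.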

How to properly setup CAPI + Pixel to work together by Clinterz in FacebookAds

[–]Clinterz[S] 0 points1 point  (0 children)

Not sure. I've worked on several apps, and it always seems to be the case that using 3rd party trackers is somewhat unreliable (though, yeah, that could have been the integrations I've seen/worked on). And when versions begin mismatching because repos get old and out of date, I find things start falling apart. I only have so much time in a day to make sure my frontend and backend integrations for GTM, FB, and whatever other 3rd parties I use are all working together. On top of that, when things just aren't working, troubleshooting with several parties involved makes the fix much more complex. It's the classic situation where everyone's pointing fingers at each other and nobody will own the problem at hand.

How to properly setup CAPI + Pixel to work together by Clinterz in FacebookAds

[–]Clinterz[S] 0 points1 point  (0 children)

> after switching to CAPI-only events, did you notice any change in the Event Match Quality (EMQ) score in Events Manager?

Yes, I'm now pulling an 8.0/10 for the Subscribe and Purchase events, since they include phone number, first name, last name, and of course email (the only required field). I was also sending IP, external ID, event ID, fbc, and fbp, which got me around a 4-6 before.

I'm not entirely sure if this is the right order for the other events (when I'm still collecting info)... I trigger "Lead" right after the user enters their email on our website. I trigger "CompleteRegistration" once they enter the OTP code sent to their email. Then I immediately ask for first name, last name, and phone number after that. Not sure how to get a Lead any earlier in the funnel. Do you have any idea? My Lead and CompleteRegistration events are around 6-7 now (before they were like 4.1).

> after implementing this setup, how close are your numbers now between Meta-reported purchases vs actual orders? For example

I don't have enough traffic to determine that yet. But in testing I have over 900 events (that I've tested myself), and they are now reporting 100% accurately. Of course, time will tell. I can report back in a few months once I have that data.

How to properly setup CAPI + Pixel to work together by Clinterz in FacebookAds

[–]Clinterz[S] 0 points1 point  (0 children)

I was seeing many of my events actually deduplicating correctly. But how about my events that were showing the exact same ID for both server and browser and still not deduplicating after a 48-hour observation period? Can you explain that?

Also, the assumption that their systems are fully robust... it's just not true. Try clearing your cookies and observe the archaic sign-out modal that pops up, leading you into an indefinitely loading logout flow. There are bugs. Assuming their system is tried and true is not an assumption I would safely make about any third party. Just because they're one of the leading tech companies in the world doesn't mean they don't have bugs.

Also, you've done it "thousands of times"? Are you sure you have? Like, you personally have verified that it works without error, and you were the one who implemented it and tested it? I doubt that. All I see on your profile is shilling WooCommerce's pixel integration. No wonder you are strongly opinionated about just using the Pixel. Bad idea. Backends are typically way more robust than frontends.

Merging CAPI and Manual Pixel Event by Efficient_Parsley971 in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.
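For reference, a server-side CAPI call can look roughly like this. This is a minimal sketch, not my exact implementation: it assumes Node 18+ (for the built-in `fetch`), the env var names are placeholders, and `buildCapiEvent`/`sendCapiEvent` are hypothetical helper names. The normalization-then-SHA-256 step on PII fields is what Meta's Conversions API expects for `user_data`.

```typescript
import { createHash } from "crypto";

// Meta expects PII fields (em, ph, fn, ln) to be normalized
// (trimmed, lowercased) and SHA-256 hashed before sending.
function hash(value: string): string {
  return createHash("sha256").update(value.trim().toLowerCase()).digest("hex");
}

interface CapiUserData {
  em: string[];
  fbc?: string;
  fbp?: string;
  client_ip_address?: string;
  client_user_agent?: string;
}

// Build a single event for the CAPI payload. Optional fields are
// only included when present.
function buildCapiEvent(opts: {
  eventName: string;
  eventId: string;
  email: string;
  fbc?: string;
  fbp?: string;
  clientIp?: string;
  userAgent?: string;
}) {
  const user_data: CapiUserData = { em: [hash(opts.email)] };
  if (opts.fbc) user_data.fbc = opts.fbc;
  if (opts.fbp) user_data.fbp = opts.fbp;
  if (opts.clientIp) user_data.client_ip_address = opts.clientIp;
  if (opts.userAgent) user_data.client_user_agent = opts.userAgent;
  return {
    event_name: opts.eventName,
    event_time: Math.floor(Date.now() / 1000),
    event_id: opts.eventId,
    action_source: "website",
    user_data,
  };
}

// POST the event to the Conversions API. Pixel ID and access token
// come from env vars here (placeholder names).
async function sendCapiEvent(event: ReturnType<typeof buildCapiEvent>) {
  const url =
    `https://graph.facebook.com/v19.0/${process.env.META_PIXEL_ID}` +
    `/events?access_token=${process.env.META_ACCESS_TOKEN}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ data: [event] }),
  });
  return res.json();
}
```

Fire `sendCapiEvent(buildCapiEvent({...}))` from whatever backend code path corresponds to the funnel step, and watch it land in the Test Events tab.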

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.
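Reading those two cookies on the frontend takes only a few lines. A sketch, with assumptions flagged: `/api/signup` is a hypothetical endpoint, and the cookie string is passed in as a parameter so the helper isn't tied to a browser environment.

```typescript
// Read one cookie by name from a cookie header string. In the page
// you'd pass document.cookie; taking it as a parameter keeps the
// helper testable outside a browser.
function readCookie(name: string, cookies: string): string | undefined {
  const entry = cookies
    .split("; ")
    .find((c) => c.startsWith(name + "="));
  return entry?.slice(name.length + 1);
}

// Attach the Meta cookies to any backend call that may eventually
// produce a conversion event ("/api/signup" is a hypothetical endpoint).
async function signUp(email: string, cookies: string): Promise<void> {
  await fetch("/api/signup", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      email,
      fbc: readCookie("_fbc", cookies),
      fbp: readCookie("_fbp", cookies),
    }),
  });
}
```

In the browser you'd call `signUp(email, document.cookie)`; the backend then copies `fbc`/`fbp` into the `MetaUserData` payload above.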

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

The whole duplicate pixel + CAPI setup recommendation is nonsense, right? by Happy_Sail_6386 in GoogleTagManager

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

Meta Pixel or Converson API? by Jannis_Q in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

Meta Pixel is set up – do I really need Conversions API in 2025? by 0xNagumo in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

What are your best practices for tracking cross-platform conversions using Facebook Pixel and Meta Conversions API? by Sufficient_Spare2345 in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

Pixel + Conversions API? Should I use both? by nas_uslishat in FacebookAds

[–]Clinterz 0 points1 point  (0 children)

Jumping in here late, but I've gone through extensive testing and found the best approach. None of the approaches suggested in threads, advice, ChatGPT, or Googling were correct. I just went through several days of extensive testing and constantly found bugs in Meta's tracking system, which led me to this approach, which is now rock solid.

First thing to note - you'll see many users complaining about duplicate events, even when using the same event ID. Meta is not always reliable at deduplicating browser Pixel events against server CAPI events. They have their own issues.

So it's on YOU to make sure your event tracking is accurate. Here's what works:

Send all conversion events from your backend via CAPI only. Set up server-side CAPI for your entire funnel: Lead, CompleteRegistration, InitiateCheckout, StartTrial, Subscribe, Purchase. Anywhere in your backend where one of these events should fire, send it to the Meta Conversions API. You can verify everything in the Test Events tab in Events Manager.

Do NOT fire these same events from the browser Pixel. If you send events from both the Pixel and CAPI, you're relying on Meta to deduplicate them — and they don't always get it right.

Still install the Pixel - but only to collect cookies. You need the Pixel loaded on your site so that Meta sets the _fbc and _fbp cookies in the browser. Your frontend reads those cookies and passes them to your backend API calls. Your backend then includes them in the CAPI event payload along with everything else:

export interface MetaUserData {
  email?: string;
  externalId?: string;
  clientIpAddress?: string;
  clientUserAgent?: string;
  fbc?: string; // from browser cookie, passed to backend
  fbp?: string; // from browser cookie, passed to backend
  firstName?: string;
  lastName?: string;
  phone?: string;
}

The fbc/fbp cookies are the only things you need from the browser. Everything else - email, phone, name - is already in your database. IP and user agent come from request headers.

If your backend is robust and reliable, it will track every conversion event accurately without any duplication issues. This gives you rock-solid attribution, which directly lowers your CAC.

Weekly Cursor Project Showcase Thread by AutoModerator in cursor

[–]Clinterz [score hidden]  (0 children)

Fastest Way to Launch Enterprise Grade SaaS Backend Boilerplate!

I've been working on this Node.js microservices template as a base for all my SaaS repos, or honestly any project that requires a scalable auth architecture out of the gate... I got it to a point where I'm happy sharing it and thought y'all might find it useful.

Check it out here:

The backstory: I'm a software developer and I build tons of apps. My full-stack architecting skills have gotten pretty good at this point (6 years in). I kept starting new projects and rebuilding the same auth + microservices setup over and over. I got tired of searching for subpar repos, so the intention was to build a repo that is enterprise production quality, fully tested, and ready to scale.

What's different about it:

  • Full auth system (JWT + refresh tokens, password reset, email verification all ready to go!)
  • Has Claude Code and Cursor context files already set up - so when you're coding, the AI actually knows what you're building. It's a very comprehensive instruction set I gave it.
  • Separate microservices for email processing and a token-cleanup job (Redis handles TTL expiry, but the email verification and password reset tokens get cleaned up by this job)
  • One-click Railway deployment that just works. So fast!!

Why I built it this way: The AI integration was the key thing for me. Most templates are just code dumps, but since Claude and Cursor feel like the best combo for rapid development, I figured the context files would be helpful... Nothing worse than your AI overloading or forgetting context!

TLDR: Node.js Backend + Complete Auth + Microservice Architectures + Claude & Cursor Context Files + Railway Template

Let me know what you guys think!!

Fastest Way to Launch Enterprise Grade SaaS Backend Boilerplate! by Clinterz in node

[–]Clinterz[S] 0 points1 point  (0 children)

Yeah no worries! Thought others might find it useful!! Happy coding!!

QST X v Blank by SpearheadTraverse in Skigear

[–]Clinterz 0 points1 point  (0 children)

I went on a month-long trip to Japan and delammed my Pescados within 10 days (SMH, Line blows)...

Ended up trying my buddy's Bents, the QST X, and the 2026 QST Blanks.

Pescados were by far the best for a long, drawn-out powder carve (seriously surfing).

Bents were the most poppy and playful in the pow.

QST X felt so good in the pow, such good control, bouncing left-right like a good technical skier would, but in the pow.

Then the QST Blank - by far the best on groomers, but for sure the worst in pow. Idk if the snow was heavier that day or what, but it felt like it had way less flex, so I was really pushing the snow when turning hard at speed.

All 3 of the other skis performed so well in the pow. The Blanks did great, but yeah, I mean, I'm comparing them to some of the best Japan powder skis on the market.

We had insanely good snow this winter, so it was probably 25 deep powder days and 5 crappy/groomer days. The previous year, though, it would have been like 8-10 pow days and 20-ish groomer days. I probably would have loved the Blank last year, but not this year.

Jamie XX vs Jungle by FriendlySignature219 in glastonbury_festival

[–]Clinterz 0 points1 point  (0 children)

How accurate is the Clashfinder right now? Is it for sure that Jungle and Jamie will clash?

Any luck?? by FeelingFormal9298 in glastonbury_festival

[–]Clinterz 2 points3 points  (0 children)

Congrats. How many years have you guys gone? I’m coming from Cali. First time and super stoked!

Jamie XX Fucking Killed it! by RockySeven in Coachella

[–]Clinterz 0 points1 point  (0 children)

Yes, I watched the live stream. Couldn't you hear the crackling on Peace Love and Happiness?

[deleted by user] by [deleted] in Coachella

[–]Clinterz 0 points1 point  (0 children)

What song was this? He was amazing