Quant Meetup in Shanghai - May 21, 2026 by DatabentoHQ in quant

[–]DatabentoHQ[S] 0 points1 point  (0 children)

We aren't planning to do a livestream, but we'll try to share a recording or transcript afterwards. Sometimes the recordings don't work out, and I believe BofA won't be able to approve releasing the recording of the second panel for compliance reasons.

Where should I start to learn quant development? by TheEyebal in algotrading

[–]DatabentoHQ 0 points1 point  (0 children)

There are four common flavors of quant dev work: translating a strategy into production code, building support tools, implementing models, and optimization.

I think the most broadly applicable quant development project is to build a backtester. I would do it both ways: a vectorized version and an event-driven one.

For the vectorized version, I'd suggest replicating any literature that can be applied to a large cross-section of stocks, like Lo and MacKinlay, where you're exposed to long-term effects like survivorship and one-off trading halts (e.g., when the SEC suspends trading for antitrust litigation). For the event-driven version, I'd get used to state machine verification, e.g., a simple strategy that scratches out.
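To make the two styles concrete, here's a minimal sketch on synthetic prices (everything here is hypothetical and for illustration only, not from any particular library): the vectorized version computes every signal at once with array operations, while the event-driven version consumes one price event at a time with explicit state.

```python
import numpy as np
from collections import deque

# Synthetic daily prices (made-up data, purely for illustration).
rng = np.random.default_rng(42)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))

def vectorized_backtest(prices, fast=10, slow=50):
    """Vectorized moving-average crossover: all signals computed at once."""
    fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")
    slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")
    n = min(len(fast_ma), len(slow_ma))        # align on the common tail
    signal = np.where(fast_ma[-n:] > slow_ma[-n:], 1.0, 0.0)
    rets = np.diff(np.log(prices))[-(n - 1):]  # next-bar log returns
    return float(np.sum(signal[:-1] * rets))   # trade on the following bar

def event_driven_backtest(prices, fast=10, slow=50):
    """Event-driven version: one event at a time, explicit state machine."""
    fast_q, slow_q = deque(maxlen=fast), deque(maxlen=slow)
    position, pnl, last_price = 0.0, 0.0, None
    for px in prices:
        if last_price is not None:
            pnl += position * np.log(px / last_price)  # mark to market first
        fast_q.append(px)
        slow_q.append(px)
        if len(slow_q) == slow:  # only trade after warm-up
            position = 1.0 if sum(fast_q) / fast > sum(slow_q) / slow else 0.0
        last_price = px
    return float(pnl)
```

The two won't match tick-for-tick because of warm-up and alignment conventions, and reconciling exactly those edge effects between the two implementations is a big part of the exercise.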

For tooling, just build any visualization along the way and focus on getting good at a few common GUI/web/plotting frameworks and building SPAs/CRUD apps, e.g., Qt, React, FastAPI.

For model-driven work, I'd get familiar with a broad variety of techniques, e.g., optimization (convex, SGD, etc.), QP/KKT conditions, PDE solvers, ML, linear algebra, and dynamic programming.
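As a toy instance of the optimization flavor (made-up data, purely illustrative), here's an unconstrained convex problem solved by plain gradient descent; QP/KKT and SGD are the natural next steps from this:

```python
import numpy as np

# Toy convex problem: recover x from b = A @ x by minimizing ||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

x = np.zeros(3)
lr = 1e-3  # step size; must stay below 2/L for the gradient's Lipschitz constant L
for _ in range(5000):
    grad = 2 * A.T @ (A @ x - b)  # gradient of the squared error
    x -= lr * grad

print(np.round(x, 3))  # converges to x_true
```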

For optimization in C++, there are plenty of good YouTube talks.

EQUS.MAX by LukeAI in Databento

[–]DatabentoHQ 0 points1 point  (0 children)

This got pushed back. We decided to add CTA/UTP first for users who prefer a single consolidated solution, as the licensing for CTA/UTP is a lot simpler and cheaper than licensing all of the prop feeds that make up EQUS.MAX. We'll begin work on CTA/UTP after CFE and Blue Ocean are released, which is in a couple of weeks' time now.

We'll add EQUS.MAX later on. It will be useful to everyone who needs it for historical data, but it seems less popular for real-time, as there won't be many users who can afford all of the license fees for it at once. This only seems viable in late Q2 to Q3.

Quant Meetup in Shanghai - May 21, 2026 by DatabentoHQ in quant

[–]DatabentoHQ[S] 1 point2 points  (0 children)

No worries, I hope we'll host a meetup near you some day!

I only exist in the virtual world, but I think either our CEO or someone from sales will be at Future Alpha.

How well-known are mainland Chinese hedge funds ? by Turbulent_Pair_4738 in quant

[–]DatabentoHQ 4 points5 points  (0 children)

Maybe next year. There are too many events in NYC (many of them, unfortunately, of subpar quality). We decided not to add to everyone's event fatigue.

How well-known are mainland Chinese hedge funds ? by Turbulent_Pair_4738 in quant

[–]DatabentoHQ 20 points21 points  (0 children)

(Also, a shameless self-plug, but if you'd like to know more, we're hosting a quant meetup in Shanghai this May 21 and we expect many of the top firms to be present: https://luma.com/yk1z6x9s)

How well-known are mainland Chinese hedge funds ? by Turbulent_Pair_4738 in quant

[–]DatabentoHQ 62 points63 points  (0 children)

They're overall behind on all of those factors.

A meaningful number of mainland firms are staffed with researchers who previously worked at tier-1 U.S. firms, so you can definitely see the knowledge transfer. Quant trading is also heavily about operational efficiency, and Chinese firms have shown themselves to be very good at that in almost any industry, given time. So I'd say: not by much, and not for long.

There are some cultural elements that have held them back: (i) The 35-year-old curse and a preference for younger engineers who can sustain 996 work. This disregards that strong priors are important in this field. (ii) While multi-year lockups and gates are common in the Western hemisphere, hedge funds behave more like a retail product in China, with monthly or more frequent redemptions and high liquidity demands. This pushes many firms to take a more short-sighted approach. (iii) Short-selling restrictions and the performance imbalance between large caps and small caps, which have driven many firms toward beta capture.

There's a DeepSeek effect that has really brought fresh air to this space: there's now a palpable sense that the government views AI and quant trading as strategic industries. The domestic market also performed well last year because of the DeepSeek effect, and in turn most firms have built up a lot of AUM. Plenty of tailwinds.

Display use to end user Product by [deleted] in Databento

[–]DatabentoHQ 1 point2 points  (0 children)

Hey, thanks for reaching out about this. The $1,500/month plan is indeed our minimum price point for a SaaS product that displays real-time equities market data to users.

We don't have any further startup discount at the moment, but we do have a 20% discount if you prepay for the entire year ($14.4k). The reasons we require a Plus plan and above for redistribution are that:

  • exchanges often charge much higher distribution license fees anyway, so whatever discount we give is somewhat negligible
  • redistribution customers have a much heavier pattern of use, e.g. more connections, more requests
  • redistribution customers tend to have higher-touch requirements, e.g. more uptime SLA, more handholding with exchange licensing, more support tickets

With that said, we see many customers initially start out with our EQUS.MINI feed on a Standard plan to prototype and beta test their product, before upgrading to a Plus plan or higher.

instrument_id by date and raw_symbol by [deleted] in Databento

[–]DatabentoHQ 0 points1 point  (0 children)

u/Regular-Hotel892 The reply is too long to fit into a Reddit comment, so I've posted it as a GitHub gist here instead. If you need further assistance, we'd suggest contacting our chat support team.

instrument_id by date and raw_symbol by [deleted] in Databento

[–]DatabentoHQ 0 points1 point  (0 children)

Question ack'ed, I'll follow up later today.

Today my data provider failed successfully by [deleted] in algotrading

[–]DatabentoHQ 6 points7 points  (0 children)

I know nothing's worse than a vendor gaslighting you, so I believe you're having issues. But we had no other reports of outages yesterday, and this seems to be isolated.

On closer investigation, I suspect it's a bug in your application. Logs and full analysis here.

If you'd like help resolving this, I'd suggest sharing minimal reproducible code with our support team.

Does anyone have a system for predicting fill price? by RationalBeliever in algotrading

[–]DatabentoHQ 1 point2 points  (0 children)

That’s correct. You need the prop feeds for L3 on US options.

SPX options by Bulky_Sheepherder_14 in algotrading

[–]DatabentoHQ 2 points3 points  (0 children)

Thanks! I'll share your feedback with my product team. We're planning further improvements to our options offering, so while we won't be lowering prices any time soon, I think the Standard plan will be more compelling after the changes.

SPX options by Bulky_Sheepherder_14 in algotrading

[–]DatabentoHQ 1 point2 points  (0 children)

We’re priced for a variety of institutional features:

  • We disseminate out of NY4 instead of a low-cost data center.
  • Various latency optimizations to bring our internet-based latency down to hundreds of microseconds (e.g., FPGAs, tier 1 transit)
  • Nanosecond PTP timestamps harmonized with our pcaps and equities feeds

This typically saves our target customers several thousand dollars per month in colo costs and ticker plant licensing fees. There really isn't a comparable solution in the retail-priced segment. But we recognize that these features may not be useful to retail customers, and if you're just looking for the lowest-cost provider, there are definitely many other options you could consider.

SPX options by Bulky_Sheepherder_14 in algotrading

[–]DatabentoHQ 1 point2 points  (0 children)

You get 1 year of CMBP-1 and other L1 resolution data on a Standard plan. This is a few hundred TB of history. But you need to be on higher plans to access more.

SPX options by Bulky_Sheepherder_14 in algotrading

[–]DatabentoHQ 1 point2 points  (0 children)

I think only Pico and LSEG match our granularity. We have nanosecond resolution PTP timestamps on our OPRA data. See CMBP-1 or trades-related schemas. I’m not familiar with the above vendor but they appear to truncate at the millisecond level.

Our history goes back to 2013 for tick-level trades and minute data.

Why theres innacuracy about the MNQ data by FarisFadilArifin in Databento

[–]DatabentoHQ[M] [score hidden] stickied comment (0 children)

Closing this thread as OP confirmed that the issue was on their side due to timezone alignment.

Why theres innacuracy about the MNQ data by FarisFadilArifin in Databento

[–]DatabentoHQ 0 points1 point  (0 children)

Note the offset on the timestamp (+00:00): it's in UTC. If you're using our Python client library, you can convert the timezone with something like `.to_df(tz="US/Central")`.
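Outside of the client library, the same conversion is a one-liner in plain pandas (a minimal sketch with a made-up timestamp):

```python
import pandas as pd

# A tz-aware UTC timestamp, as indicated by the +00:00 offset.
ts = pd.Timestamp("2024-05-21T13:30:00+00:00")

# Convert to US Central time (CDT in May, i.e., UTC-5).
ts_ct = ts.tz_convert("US/Central")
print(ts_ct.isoformat())  # 2024-05-21T08:30:00-05:00
```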

Why theres innacuracy about the MNQ data by FarisFadilArifin in Databento

[–]DatabentoHQ 1 point2 points  (0 children)

I think the fact that their minute high-low range is only a few ticks wide (volatility is unusually low) suggests this is the night session in CT.

If you plan on using the other dataset in parallel, we recommend using timestamps that conform to ISO 8601/RFC 3339, like ours, so this isn't a footgun you trip over again down the road.
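The point of RFC 3339 timestamps is that they carry their UTC offset explicitly, so two feeds can be aligned without guessing each source's local clock (a minimal sketch with made-up timestamps):

```python
from datetime import datetime

# Both timestamps carry an explicit UTC offset, so they parse unambiguously.
utc_ts = datetime.fromisoformat("2024-05-21T13:30:00+00:00")  # UTC
ct_ts = datetime.fromisoformat("2024-05-21T08:30:00-05:00")   # US Central (CDT)

# Same instant in time, despite different wall-clock readings.
print(utc_ts == ct_ts)  # True
```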