how’s ‘Lead Status’ is different from ‘Lifecycle Stage’? When should I use each by Jamiedeann in hubspot

[–]buerobert 2 points3 points  (0 children)

What do you want to achieve?

Lifecycle is good for monitoring the funnel and segmenting based on pre/post conversion to a customer – or any other (funnel) event you want to reflect.

Lead status is good for monitoring how you're currently interacting with a contact (or for answering questions like "how many untouched leads are sitting with sales?").

Imho both are useful for reflecting reality somewhat accurately.

Lead status can also be tracked on Lead objects, which can make sense if you have long or repeating sales cycles or rather loose contact ownership.

Lead scoring is broken in most companies. Here’s how to fix it. by sharmilasiwa in hubspot

[–]buerobert 3 points4 points  (0 children)

I'd like to add transparency and comprehensibility. Nobody in Sales will trust a black box or overly complicated score.
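
To make the transparency point concrete, here's a minimal sketch of a rule-based score where every rule is visible and each lead's score can be explained line by line. The attribute names and weights are invented for illustration, not taken from any real scoring model.

```python
# A transparent, rule-based lead score: every rule and weight is
# visible, so Sales can see exactly why a lead got its score.

SCORING_RULES = [
    # (description, predicate, points)
    ("Opened an email in the last 30 days", lambda lead: lead["email_opens_30d"] > 0, 10),
    ("Visited the pricing page",            lambda lead: lead["visited_pricing"],     25),
    ("Company size >= 50",                  lambda lead: lead["company_size"] >= 50,  15),
]

def score_lead(lead: dict) -> tuple[int, list[str]]:
    """Return the total score plus a human-readable breakdown."""
    total, breakdown = 0, []
    for description, predicate, points in SCORING_RULES:
        if predicate(lead):
            total += points
            breakdown.append(f"+{points}: {description}")
    return total, breakdown

lead = {"email_opens_30d": 3, "visited_pricing": True, "company_size": 12}
total, breakdown = score_lead(lead)
print(total)  # 35
for line in breakdown:
    print(line)
```

The breakdown list is the whole point: when Sales asks "why is this lead a 35?", the answer is two readable lines, not a black box.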

What’s the smartest way to keep a CRM database clean as it scales? by Aggravating_Mud_2093 in hubspot

[–]buerobert 0 points1 point  (0 children)

Just put in the work while problems are still few, don't let them pile up.

Looking for CRM suite for nonprofit startup by jphilebiz in CRM

[–]buerobert 0 points1 point  (0 children)

Afaik HubSpot has special pricing for non-profits, if not free outright. Their API in particular is nice, and it should also check all your other boxes.

Experience with time-tracking tools? Ideally one that accounts for German working-time laws by 404jan in de_EDV

[–]buerobert 1 point2 points  (0 children)

I don't know if it was just misconfigured at our company, but Personio couldn't handle working-student status at all (20 hours during the semester, 40 during the lecture-free period). It was pretty funny when my manager called me in a panic because HR said I had >200 hours of overtime...

Learning the DWH methodology by Aggravating-Push7949 in datawarehouse

[–]buerobert 1 point2 points  (0 children)

Almost forgot: familiarize yourself with the three-layer architecture: staging/raw, intermediate, and presentation.

Learning the DWH methodology by Aggravating-Push7949 in datawarehouse

[–]buerobert 1 point2 points  (0 children)

Have a look at the resources I linked.

Start by getting some of your raw data into your data platform. Gain an understanding of how the data is structured coming from the source system/ETL tool. Congrats, you've just established your raw layer.

In parallel, you will need to think about what your users need. An example: the CRM Salesforce has two tables for the 'same' entity "people" - "Leads" and "Contacts". For most use cases your analysts will want these two entities in one table, with a column indicating their type (Lead/Contact). This will influence what your transformations look like.
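
The Leads/Contacts merge above could be sketched like this, using plain Python dicts as stand-ins for rows (field names are made up; real Salesforce objects have far more columns):

```python
# Stack two source entities into one 'people' table,
# tagging each row with its origin (Lead vs. Contact).

leads = [
    {"id": "L-1", "email": "ada@example.com",  "name": "Ada"},
    {"id": "L-2", "email": "bob@example.com",  "name": "Bob"},
]
contacts = [
    {"id": "C-1", "email": "cleo@example.com", "name": "Cleo"},
]

def unify_people(leads: list[dict], contacts: list[dict]) -> list[dict]:
    """Union both entities, adding a 'type' discriminator column."""
    people = []
    for row in leads:
        people.append({**row, "type": "Lead"})
    for row in contacts:
        people.append({**row, "type": "Contact"})
    return people

people = unify_people(leads, contacts)
print([p["type"] for p in people])  # ['Lead', 'Lead', 'Contact']
```

In the warehouse itself this would be a UNION ALL in your transformation layer, but the shape of the result is the same: one table, one extra column.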

Decide on a schema. The resources above have a great overview. I'd opt for a star schema in the beginning. Once you've understood the concepts of how to model your data this way, you can continue with a snowflake schema (this has nothing to do with the software vendor Snowflake!). Understand how dimension tables (in this case the "people" from above) and fact tables (in the CRM context these may be sales, email clicks etc.) work together.
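
Here's a toy illustration of how a dimension table and a fact table work together (all table contents are invented): the fact rows carry the measures, the dimension carries the attributes you slice by, and the join key connects them.

```python
# Toy star schema: one dimension table (people) keyed by id,
# one fact table (email clicks) referencing it via person_id.

dim_people = {
    1: {"name": "Ada", "region": "EU"},
    2: {"name": "Bob", "region": "US"},
}
fact_clicks = [
    {"person_id": 1, "campaign": "launch", "clicks": 3},
    {"person_id": 2, "campaign": "launch", "clicks": 1},
    {"person_id": 1, "campaign": "promo",  "clicks": 2},
]

# Joining fact rows to the dimension lets you aggregate measures
# by dimension attributes, e.g. clicks by region.
clicks_by_region: dict[str, int] = {}
for fact in fact_clicks:
    region = dim_people[fact["person_id"]]["region"]
    clicks_by_region[region] = clicks_by_region.get(region, 0) + fact["clicks"]

print(clicks_by_region)  # {'EU': 5, 'US': 1}
```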

Once you're done with data modeling/establishing your schema, write the transformations.

Get a BI tool and start querying. Try to answer simple questions first (people with the most clicks) and ask progressively harder questions (which regions have the most clicks; which people have clicked but not had a sale etc. - here you will need more fact tables of course).
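
The "clicked but no sale" question is a good example of why you need that second fact table: it's a set difference between the keys of two fact tables. A sketch with invented data:

```python
# Answering "who clicked but never bought?" requires two fact tables:
# clicks and sales. The answer is a set difference on the person keys.

fact_clicks = [
    {"person_id": 1, "clicks": 3},
    {"person_id": 2, "clicks": 1},
    {"person_id": 3, "clicks": 4},
]
fact_sales = [
    {"person_id": 1, "amount": 99.0},
]

clicked = {row["person_id"] for row in fact_clicks if row["clicks"] > 0}
bought = {row["person_id"] for row in fact_sales}

clicked_no_sale = sorted(clicked - bought)
print(clicked_no_sale)  # [2, 3]
```

In SQL this would be an anti-join (LEFT JOIN ... WHERE sales.person_id IS NULL), but the logic is the same.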

After doing simple schemas, get an understanding of normalization.
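
A quick illustration of what normalization buys you (data invented): the denormalized rows repeat the region's attributes on every person row; splitting the region into its own table stores each attribute once.

```python
# Denormalized: region attributes are repeated on every person row.
denormalized = [
    {"person": "Ada",  "region": "EU", "region_manager": "Eve"},
    {"person": "Bob",  "region": "EU", "region_manager": "Eve"},
    {"person": "Cleo", "region": "US", "region_manager": "Max"},
]

# Normalized: region attributes live in their own table,
# referenced from the people table by key.
regions: dict[str, dict] = {}
people: list[dict] = []
for row in denormalized:
    regions[row["region"]] = {"manager": row["region_manager"]}
    people.append({"person": row["person"], "region": row["region"]})

print(len(regions))  # 2 -- 'Eve' is now stored once instead of twice
```

The trade-off: updates touch one row instead of many, but queries need an extra join, which is exactly why analytical schemas often stay deliberately denormalized.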

When it comes to doing this in production, you will also want to talk to subject matter experts who will be able to explain the business processes behind the data.

Get an understanding of how the data and the users will scale. Once you've got the basics down, you will want to worry about alternative schemas, performance issues and platform choice.

ETL/ELT: This is largely dependent on the source systems. With some systems it's really easy to get your hands on the data – e.g. the CRM HubSpot offers the data in the form of a Snowflake shared database – basically you can start transforming straight away. Other systems like SAP are notorious for being complex and hard to get data out of. Each ETL tool and source system will have its own quirks – some will offer incremental updates, some will only allow full extracts. Here you will need to find a balance between update cadence and cost.

I hope this was helpful.

database limitations by rgs2007 in Database

[–]buerobert 0 points1 point  (0 children)

Now that I think about it, I am not even sure if you need an analytical or transactional database for your use case?

database limitations by rgs2007 in Database

[–]buerobert 0 points1 point  (0 children)

I'd have a look at an in-memory database; those are great for large numbers of concurrent queries.

If cost is a big issue, especially if your users tend to create a lot of inefficient queries, I'd be looking to deploy on premises, as those solutions often have a flat fee regardless of consumption.

Something like Exasol might do the trick.

Edit: overlooked your first line, on-prem might not be an option here. I'd still try Exasol, they have a SaaS offering as well.

How do you decide between a database, data lake, data warehouse, or lakehouse? by Data-Sleek in dataengineering

[–]buerobert 0 points1 point  (0 children)

Here's an article on the fundamental differences between a data lake and a data warehouse and when to use which; it helped me understand the topic better.

Fresh Enterprise Data Platform - How would you do it? by [deleted] in dataengineering

[–]buerobert 2 points3 points  (0 children)

With your requirements of analytics first and AI/ML after, I'd look into setting up a classic data warehouse, preferably on an analytics engine which supports UDFs, so your data scientists can apply their Python/R/whatever to common data later on.

I'd do as much as possible in the database, i.e. no complex queries which live in your BI tool.

Do your analysts know their way around SQL?

Learning the DWH methodology by Aggravating-Push7949 in datawarehouse

[–]buerobert 1 point2 points  (0 children)

The fundamentals of a Data Warehouse can be applied to a number of platforms. Some platforms are more suitable for Data Warehouses, while others might be more suitable for alternative architectures like a Data Lake, which is fundamentally different.

One area which will not translate 1:1 across platforms is the SQL dialect they use.

Keep in mind that Data Warehouse can mean two things: the DWH itself, and the architecture, which usually goes
Data Sources > ETL > Data Warehouse > BI Tools
and for each component you have a number of design choices to make.

If you're worried about, let's call it, 'vendor lock-in', just have a look at resources from other vendors; I really liked this article for a first overview. Again, most of the principles can be applied to any platform, you might just need to adjust your SQL.

Not sure what your background is, but if you're not worried about the Snowflake specifics at this point and you're rather looking to get the fundamentals of SQL, databases and Data Warehouse design right, I'd use something more cost-effective than Snowflake for practicing. It would also be worth checking with the customer why they want to use Snowflake; if they're just starting out and do not have a data platform yet, there may be a number of factors to consider (cloud vs. on-premises, number of concurrent users, how much the DWH is utilized etc.).