Data Cloud is here and it looks like it won’t go away. But how can we, as architects, embed it into our solutions? Why, and for what purposes?
One way to find out is Trailhead and potentially the Data Cloud certification, which will give you one side of the view: the ultimate Salesforce vision of all data in one system, duplicates unified, search across unstructured data, and plenty of other possibilities we weren’t used to.
Over time, Data Cloud is set to become a foundational layer that will facilitate the storage, processing, and activation of data across many organizational functions, including CRM, marketing, web, commerce, loyalty, and analytics.
Eliot’s book offers the other view, I would say a more cautious one, and we will need to find out where the future really lies.
Pricing
Obviously, one important part of the ROI calculation and of deciding how to use the system is pricing, which for a long time was very unclear. I’m not sure it is much better now that the calculator, not run by Salesforce, has been introduced. The Trailhead module might help you as well, but the calculator is probably easier to use (thank you, Isaas Shaffer, for providing it to the community).
It is hard to understand all the possibilities without further study, but let’s pretend I’m a medium-sized (in my view) customer, with 1M records to start with, a small part of which will change on a daily basis, triggering the unification and insights calculation again. I won’t go with all those extra features such as streaming data, which would be more costly, or unstructured data processing; maybe as a next step.
Still, my calculation is 160,000 credits the first year and roughly 50,000 in the following years. Combined with the prices I see in my org (1,000 € per month per 100,000 credits), it is pretty expensive fun, and I bought just the minimum I could. Actually, the free Data Cloud comes with 250,000 credits, so I’m probably fine and should not worry.
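The back-of-envelope math above can be sketched in a few lines of Python. Note that the figures are my rough estimates and the per-credit price is what I saw in one org, not official Salesforce pricing:

```python
# Rough credit-cost sketch based on the numbers above; the price per
# 100,000 credits is what I saw in my org, not an official rate.
PRICE_EUR_PER_100K_CREDITS = 1_000
FREE_TIER_CREDITS = 250_000  # credits included with the free Data Cloud

def credit_cost_eur(credits: int) -> float:
    """Cost in EUR for a given number of consumed credits."""
    return credits / 100_000 * PRICE_EUR_PER_100K_CREDITS

first_year = credit_cost_eur(160_000)   # 1600.0 EUR
later_years = credit_cost_eur(50_000)   # 500.0 EUR

# With the free tier, the estimated first year fits comfortably:
print(first_year, later_years, 160_000 <= FREE_TIER_CREDITS)
```

Nothing sophisticated, but having the assumptions written down makes it easy to re-run the estimate when the data volumes or prices change.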
With these prices in mind, I would follow Eliot’s recommendation to limit the amount of data pushed into Data Cloud and push only data that makes sense for the current business needs. I understand that Agentforce might be clever enough to make something great from all the data I might have in the organization, but right now I would ask the business to specify their actual needs and push only relevant data. And it isn’t just about data related to customers, but also about excluding inactive customers. Why unify/harmonize them if they don’t do business with you anymore?
Segmentation is also, according to Eliot, usually the largest area of credit consumption, so segments should be designed with the goal of minimizing the number of rows processed. (I didn’t find these credits in the calculator.)
Other notes from the book
I did know that Data Cloud is built on top of AWS, but seeing the list of technologies in one place is incredible – Amazon S3, Parquet files, Elastic Kubernetes Service, Sync, CloudWatch, AWS Identity and Access Management, AWS Auto Scaling, ElastiCache, Simple Queue Service, Elastic MapReduce, Spark, DynamoDB, Relational Database Service.
Think about batch versus real-time data processing, especially if your data changes on more of a daily basis. Also, if you have an ETL tool in your landscape, you might want to use that system to transform your data rather than leaving it to Data Cloud and the related credit consumption – I mean, transforming 100,000 rows uses just 40 credits, so who cares, but maybe …
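To illustrate why the “40 credits per 100,000 rows” figure can still matter, here is a hypothetical scaling exercise. The rate comes from the text; the daily volume is an invented example:

```python
# Hypothetical scaling of batch-transform credits; the 40-credit rate is
# the figure quoted above, the daily volume is a made-up example.
CREDITS_PER_100K_ROWS = 40

def transform_credits(rows: int) -> float:
    """Credits consumed by transforming the given number of rows."""
    return rows / 100_000 * CREDITS_PER_100K_ROWS

# Transforming 100k rows once is trivially cheap:
print(transform_credits(100_000))   # 40.0 credits

# But re-transforming 10M rows every day for a year adds up:
yearly = transform_credits(10_000_000) * 365
print(yearly)                       # 1,460,000.0 credits
```

At that volume, offloading the transformation to an existing ETL tool suddenly looks much more attractive.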
Things to keep in mind:
- categories cannot be changed and are inherited from the first DLO mapped to a DMO;
- you need to select an immutable data field for the Engagement category, otherwise you risk duplicated data;
- person accounts cannot be used in identity resolution, as they contain a mixture of account and contact fields (what??!!);
- the original intent of „party“ was to provide a grouping of individuals (e.g. a household or an organizational division), but right now it refers to either an individual or an account;
- you can trigger a segment refresh via flow, so segments refresh only when really needed and not on a regular cycle.
Timing
Timing of a Data Cloud project is crucial, and before you have done a few of them you have no clue how to estimate it. Again, according to Eliot, the discovery, planning, and architecture design phases typically amount to 70 % of the whole implementation. Which shows how easy the „real“ implementation is, and how important a part the experienced solution architect plays in the project.
The simple rule when approaching discovery for a Data Cloud implementation is to start with the end in mind. Begin by understanding the business challenges and use cases.
CDP or not CDP?
Data Cloud arrived after Genie, which arrived after the CDP, so you might want to say that Data Cloud is a kind of CDP, maybe on steroids – which Salesforce doesn’t really like, as it is much more. Like what? Eliot’s take is that a traditional CDP is used specifically for marketing, while Data Cloud allows you to create segments according to business needs, such as:
- sales teams to pre-qualify leads based on their engagement behavior;
- service teams to prioritize cases based on customer profiles;
- analysts to group individuals based on demographic, psychographic, behavioral, and geographic data;
- IT teams to understand usage across devices;
- finance teams to analyze the economic value of specific customer groups;
- etc.
Looks to me almost like an analytical tool.
Consent model
I’m guilty – the consent model was introduced in 2018 and extended later on, and I never really paid attention to those extra objects, as I always felt that while the structure is powerful, my customers don’t have such extensive needs. And Data Cloud builds on this structure to fully support those needs. Actually, did you know there are 23 objects and relevant DMOs for it? There are 140 DMOs (and counting) in total in the comprehensive model of Data Cloud, aka the Customer 360 Data Model.
A well architected solution takes time and comes at a cost, but if you think good architecture is expensive, try bad architecture.
That was the final sentence of this great, roughly 140-page-long book, full of information and food for thought. And I would say that if you are thinking about implementing Data Cloud, you should definitely add it to your library.
Leave a comment, thanks!