What data challenge will you tackle this new year? One year working with partners at Snowflake

With the new year comes the retrospective. It is also an opportunity to give some visibility into what the partner sales engineer role means. We currently have 2 positions open in Europe (UK and Germany) and a lot more to come! Don't hesitate to contact me to learn more.



2021, the year of Data Mesh 

Data Mesh has been a trending architecture topic for several months. It now comes up in most of the discussions we have with large accounts (those with good data maturity).

Of course it resonates very well with the Snowflake Data Cloud: the ability to seamlessly share data between Snowflake accounts ("domains") wherever they are, but also the elastic and scalable compute resources that can be allocated to each of these domains.
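To make this a bit more concrete, here is a minimal sketch (database, table and account names are hypothetical placeholders) of how one domain could publish a data product to another Snowflake account with Secure Data Sharing, using the Snowflake Python connector:

# Producer domain exposes one data product to a consumer domain, no copy, no pipeline.
# Names and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="producer_account",
    user="data_engineer",
    password="...",
    role="ACCOUNTADMIN",
)
cur = conn.cursor()

# Create a share and grant read access to the data product.
cur.execute("CREATE SHARE IF NOT EXISTS sales_domain_share")
cur.execute("GRANT USAGE ON DATABASE sales_db TO SHARE sales_domain_share")
cur.execute("GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_domain_share")
cur.execute("GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_domain_share")

# Make the share visible to the consumer domain's account.
cur.execute("ALTER SHARE sales_domain_share ADD ACCOUNTS = marketing_domain_acct")

# Consumer side (run in the other account): mount the share as a database:
#   CREATE DATABASE shared_sales FROM SHARE producer_account.sales_domain_share;

The consumer domain queries the shared data live, with its own virtual warehouse, which is exactly the "domain with allocated resources" idea above.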

Domain centric architecture
Actually, a year ago, before I knew about the Data Mesh architecture principles, I wrote a blog post, "Domain centric architecture: Data driven business process powered by Snowflake Data Sharing". What a mistake! Had I known, I should have called it "Data Mesh, something..." for SEO. I remember looking for a good name, and Domain Centric Architecture was the best fit.



Since then, Zhamak Dehghani, director of emerging technologies at Thoughtworks, has done a great job promoting the Data Mesh principles she first published in 2019.

So what happened next on my side? As a GSI sales engineering team, our role is to educate our partners on Snowflake but also to share some thought leadership, connecting market trends with Snowflake capabilities. I worked on Data Mesh presentations for partners, presented them probably more than 10 times in Europe, and also delivered a dedicated session during Big Data Paris with my partner in crime Ivan Smets (VP Sales Enterprise France).


Data Mesh session at Big Data Paris 2021

Ramping up Snowflake practices
and building communities with partners

I joined Snowflake a year and a half ago already. Everything is moving so fast that it feels like 3 years. One of the first things we tried to do when I joined was to create connections with partners, first in France and now across Europe with my Europe GSI role.
What is the impact? 650+: that is the number of participants in the Snowflake Rendez-Vous in France since we started. We now run 1 session per week, ranging from introductions to deep dives, focus topics and selling Snowflake.
For GSIs, the scale is totally different: each member of the team in Europe has this level of reach, but per month.

The upcoming sessions in French are available here: https://forms.gle/t3W8dv3nziv1LbhbA

Data Clean Room: combine data without sharing PII/raw data with one another

Under the pressure of increased privacy regulation in the marketing world, many customers are becoming interested in the concept of data clean rooms. A data clean room is a safe place that allows multiple companies, or divisions of a single company, to bring data together for joint analysis under defined guidelines and restrictions that keep the data secure.
Data clean rooms were another trending topic in 2021 (with the end of third-party cookies).

Clean rooms also cover many other use cases, for example allowing 2 parties (let's say a retail bank and an insurer) to combine their customer insights without exposing PII or raw information to each other - data is the new gold. We can expect a lot of discussions with GSIs and customers about data clean rooms built on Snowflake in 2022.
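To illustrate the principle (a toy Python sketch, not the way a clean room is actually built on Snowflake), the idea is that each party only contributes a hashed key, and only sufficiently aggregated results ever leave the "room":

# Two parties match customers on a hashed identifier; only aggregates above a
# minimum group size are returned - never PII or raw rows. Sample data only.
import hashlib
import pandas as pd

def hashed(email: str) -> str:
    """One-way hash so raw identifiers are never exchanged."""
    return hashlib.sha256(email.lower().encode()).hexdigest()

bank = pd.DataFrame({
    "customer_key": [hashed(e) for e in ["a@x.com", "b@x.com", "c@x.com"]],
    "segment": ["premium", "standard", "premium"],
})
insurer = pd.DataFrame({
    "customer_key": [hashed(e) for e in ["a@x.com", "c@x.com", "d@x.com"]],
    "policy_count": [2, 1, 3],
})

MIN_GROUP_SIZE = 2  # suppress groups too small to stay anonymous
overlap = bank.merge(insurer, on="customer_key")
result = (overlap.groupby("segment")
                 .agg(customers=("customer_key", "nunique"),
                      avg_policies=("policy_count", "mean"))
                 .query("customers >= @MIN_GROUP_SIZE"))
print(result)

On Snowflake the same constraints are typically enforced with shares, secure views and row access policies rather than in client code, but the contract is the same: joint analysis in, aggregates out.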


One Global Team

Working with GSIs is also an opportunity to work with an incredible team distributed all around the world.

In order to deliver the most accurate messages to our partners, we leverage talent from many different SMEs and some of the most inspiring people in tech. Just to name a few (one from each continent): meet Eda, who can talk about any advanced topic with such precision and enthusiasm; Carlos, who never stops producing amazing content; Santosh, with his 21+ years of experience helping GSIs with complex migration scenarios; but also experts from our CTO office delivering focused workshops on security, data governance or SAP-to-Snowflake scenarios.


BOM, BOM, BOM,
my dear manufacturing use cases

I spent most of my time in previous years working on aircraft, train and car manufacturer use cases. So my first idea with Snowflake was of course to run some tests and illustrate how Snowflake could be leveraged in an engineering context.
Merging BOM on the fly with Data Sharing

Last year I published a blog post on BOMs (bills of materials) and Data Sharing. The article is mainly about using Data Sharing to seamlessly merge BOMs from multiple parties, a scenario I have seen in the past, but it is probably even more interesting for the hierarchy processing logic, with the metrics given at the end of the post:

The dataset contains:
  • 500 generated products (large-aircraft-like)
  • 18 million links split between the main site and the 2 component-related sites
  • Performance on a Large virtual warehouse (Snowflake can go up to 4X-Large...):
Snowflake generates the 13.4 million unique BOM paths from the 500 products in 10.22s
Snowflake calculates the 875K aggregates to get the average cost per product and part reference in 7.18s

And this is blazing fast compared to anything I had done or seen before (even from large, hyper-specialised Big Data vendors).
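As a rough illustration of what "generating BOM paths" means (a toy Python sketch with made-up parts, not the SQL from the original post, which relies on Snowflake doing this traversal at scale), the logic boils down to expanding parent/child links into root-to-leaf paths and rolling quantities and costs up along them:

# Expand (parent, child, quantity) links into full BOM paths and roll up cost.
# In Snowflake this kind of traversal is typically expressed as a recursive CTE.
from collections import defaultdict

links = [  # tiny hypothetical BOM
    ("aircraft", "wing", 2),
    ("wing", "rib", 30),
    ("wing", "skin_panel", 12),
    ("rib", "bracket", 4),
]
part_cost = {"bracket": 3.0, "skin_panel": 150.0}  # leaf part unit costs

children = defaultdict(list)
for parent, child, qty in links:
    children[parent].append((child, qty))

def expand(node, qty=1, path=()):
    """Yield (path, cumulative quantity) for every node below `node`."""
    path = path + (node,)
    yield path, qty
    for child, child_qty in children[node]:
        yield from expand(child, qty * child_qty, path)

# Total cost per root product = sum over leaf paths of quantity * unit cost.
total = sum(q * part_cost.get(p[-1], 0.0) for p, q in expand("aircraft"))
print(f"{sum(1 for _ in expand('aircraft'))} BOM paths, total cost {total}")

The hard part is not the recursion itself but doing it over millions of links in seconds, which is where the elastic virtual warehouse comes in.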

Building solutions with GSI

Working with GSIs also means a lot of solutions: in retail, supply chain, automotive, financial services, data governance. It means exploring and crunching industry reports, identifying use cases, creating solution architectures and pitch decks.

It also means working with very talented industry specialists and technology partners to create assets and demos and deliver webinars. Below is a webinar delivered by OCTO (part of Accenture) in French.




A PS4, A Formula 1 game but also Kafka and Snowflake,
Data having fun again

I didn't have much time to work on demos this year, so let's come back to this one. Just take a racing wheel, a console (or a PC) and a racing simulator. These are the ingredients to generate IoT data, send it to Kafka, then leverage the Kafka connector for Snowflake to continuously ingest car telemetry into Snowflake and produce KPIs and nice visualisations.
IoT scenario with a PS4 and a racing wheel
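For the plumbing, a minimal sketch of the telemetry-to-Kafka leg could look like the following (the UDP port, packet layout and topic name are placeholders, not the game's actual format; the ingestion into Snowflake itself is done by the Kafka connector on the other side of the topic):

# Read telemetry packets broadcast by the racing game over UDP and publish
# them as JSON events to a Kafka topic.
import json
import socket
import struct
import time

from kafka import KafkaProducer  # kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 20777))  # placeholder UDP telemetry port

while True:
    packet, _ = sock.recvfrom(2048)
    # Hypothetical packet layout: speed (float), rpm (float), gear (int)
    speed, rpm, gear = struct.unpack_from("<ffi", packet)
    producer.send("car_telemetry", {
        "ts": time.time(),
        "speed_kmh": speed,
        "rpm": rpm,
        "gear": gear,
    })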

So many things to learn: O'Reilly on-demand

In the past few years I spent a lot of time on Coursera with the Stanford Data Science and Deep Learning Specialization, the TensorFlow in Practice Specialization, and Product Management and Digital Transformation courses.
But wait, I had totally missed O'Reilly on-demand and only discovered the service this year. OK, it's quite expensive, around 40€ per month (actually almost the price of a Coursera paid certificate per month), but there are so many things to learn: a bit old school with books, but also webinars available on demand. One regret: my year was so busy that I haven't leveraged it as much as I could yet.





At Snowflake we now also have access to Udemy for Business;
I will probably need shorter nights to make the most of all this content!

Have a safe and happy holiday!

and Happy New Year 2022!


