What data challenge will you tackle this new year? 2018, one (more) year of Solution design at MarkLogic
With the new year here, it's time for a retrospective of the past year and to prepare for 2019.
First, 2018 was the year of GDPR (the EU General Data Protection Regulation). GDPR had been on the agenda of global companies and consulting firms for at least two years, as the most important change in data privacy regulation in 20 years. From the MarkLogic perspective, GDPR is a great opportunity to demonstrate the value of an operational Data Hub that stores data and metadata together to support the robust data governance required by the regulation. Concrete use cases include, for example, consent management in a Customer 360 hub, or tackling the right to be forgotten by managing data lineage from the golden record back to the source dataset records.
- A detailed illustration is described in this blog post: How a Data Hub can help to find back memory to tackle GDPR right to be forgotten.
- And a more generic illustration is detailed here: Metadata in operational systems to enforce compliance (GDPR is not far away).
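The idea behind the right-to-be-forgotten use case can be sketched in a few lines of plain Python (the URIs, document shapes and field names below are hypothetical illustrations, not the actual MarkLogic Data Hub API): each golden record carries lineage metadata pointing back to its source records, so a forget request can walk the lineage and erase everything.

```python
# Hypothetical in-memory stand-in for the Data Hub: URI -> document,
# where each document stores data and metadata (lineage, consent) together.
hub = {
    "/golden/customer-42.json": {
        "data": {"name": "Jane Doe", "email": "jane@example.com"},
        "metadata": {
            "lineage": ["/raw/crm/1001.json", "/raw/web/77.json"],
            "consent": {"marketing": False},
        },
    },
    "/raw/crm/1001.json": {"data": {"NAME": "DOE Jane"}, "metadata": {"lineage": []}},
    "/raw/web/77.json": {"data": {"mail": "jane@example.com"}, "metadata": {"lineage": []}},
}

def forget(hub, golden_uri):
    """Delete a golden record and, via its lineage metadata, every source record."""
    to_delete = [golden_uri]
    deleted = []
    while to_delete:
        uri = to_delete.pop()
        doc = hub.pop(uri, None)
        if doc is None:
            continue  # already deleted or unknown URI
        deleted.append(uri)
        to_delete.extend(doc["metadata"].get("lineage", []))
    return deleted

erased = forget(hub, "/golden/customer-42.json")
print(sorted(erased))
```

Because lineage lives next to the data itself, the forget operation needs no external lineage catalogue: one traversal from the golden record is enough.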
In 2018, I was also involved in an observe-the-business SmartDataHub project for a client, where we massively leveraged the MarkLogic Optic API to compute aggregates on ERP data, in order to build KPIs and dashboards on parts purchases and optimise the supplier portfolio. I should write a post on this story someday.
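As a rough illustration of the kind of aggregates involved, here is a plain-Python sketch with invented purchase lines, standing in for what the Optic API would compute server-side with a group-by over ERP rows:

```python
from collections import defaultdict

# Invented ERP purchase lines for illustration only.
purchases = [
    {"supplier": "ACME", "part": "P-100", "qty": 50, "unit_price": 2.0},
    {"supplier": "ACME", "part": "P-200", "qty": 10, "unit_price": 15.0},
    {"supplier": "Globex", "part": "P-100", "qty": 80, "unit_price": 1.8},
]

def supplier_kpis(rows):
    """Group purchase lines by supplier and compute simple spend KPIs."""
    kpis = defaultdict(lambda: {"spend": 0.0, "lines": 0})
    for row in rows:
        k = kpis[row["supplier"]]
        k["spend"] += row["qty"] * row["unit_price"]
        k["lines"] += 1
    return dict(kpis)

print(supplier_kpis(purchases))
```

In the real project the grouping and aggregation ran inside the database at scale; this sketch only shows the shape of the KPI computation feeding the dashboards.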
2018 was also a year of continuity for me: continuity in Manufacturing (compared to 2017), but most of all in Digital Continuity, which matches business challenges not only from Manufacturing but from all the other industries as well:
- We can start with the two-part blog series "Semantics to increase synergies and move towards the Industry 4.0 Digital Thread".
- Lineage, and especially parts lineage, is a key enabler in Manufacturing to leverage collected data and make the Digital Thread happen. ("Article reference matching, enabler of a manufacturer data strategy")
- Then you can apply the previous concepts to documentation management, in order to personalise product documentation based on product attributes. ("Personalised product Technical Documentation made simple !")
- During the year, I also spent some time improving the design of the Product Testing Data Hub for Certification/Safety/Fraud management, where time meets semantics. The objective of the solution is to identify periods of time matching a specific temporal pattern (combined with geospatial, structured and unstructured data conditions) in "realtime", so that engineers know where and when they must look in detail. The solution makes the most of MarkLogic capabilities to deliver a service that is quite unique given the overall workload spent on it (a few tens of man-days).
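The temporal-pattern idea above can be sketched in a toy form (the event shapes and pattern below are hypothetical, far simpler than the real solution): scan time-ordered test events and report the periods where one kind of event is followed by another within a given window, so engineers know where and when to look in detail.

```python
def matching_periods(events, max_gap):
    """Return (start, end) pairs where a 'pressure' event is followed by a
    'temperature' event no more than max_gap time units later.
    `events` is a list of (timestamp, kind) tuples, sorted by timestamp."""
    periods = []
    for i, (t1, kind1) in enumerate(events):
        if kind1 != "pressure":
            continue
        for t2, kind2 in events[i + 1:]:
            if t2 - t1 > max_gap:
                break  # events are time-sorted, so no later match is possible
            if kind2 == "temperature":
                periods.append((t1, t2))
                break
    return periods

events = [(0, "pressure"), (3, "temperature"), (10, "pressure"), (30, "temperature")]
print(matching_periods(events, max_gap=5))  # the spike at t=30 is too late for t=10
```

The real system evaluates richer patterns (with geospatial and content conditions) as data arrives, but the core of the service is the same: turn raw event streams into a short list of periods worth a detailed look.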
And from a hobby perspective,
For me, 2018 is the year we'll say, "The personal computer is dead, long live personal computing!", another post from the beginning of the year. Shadow, a French startup with an amazing piece of technology (cocorico!), released its personal computer hosted in the cloud:
- GTX 1080-class GPU
- Xeon CPU with 8 threads
- 1 Gb/s download bandwidth
- all in the cloud, with the UI streamed to the local screen

...everything needed to play, but also to run TensorFlow on GPU or Unity3D with high performance.
And from a learning perspective,
I've spent a good number of hours on the Deep Learning Specialization from Andrew Ng (great learning material, mixing theory and practice, available on Coursera), and I finished the year with the AWS Solutions Architect - Associate Qwiklabs quest.
What's next?
- The year will start with some new courses from https://deeplearning.mit.edu.
- Cloud and vertical solutions will be my focus for the year,
- and ML:
  - MarkLogic
  - Machine Learning
- And more concepts and PoCs...

Happy new year!