What data challenge will you tackle this new year? 2018, one (more) year of Solution design at MarkLogic

With the new year, it's time for a retrospective of the past year and to prepare for 2019. Happy new year!




First, 2018 was the year of GDPR (the EU General Data Protection Regulation). GDPR had been on the agenda of global companies and consulting firms for at least two years, as the most important change in data privacy regulation in 20 years. From the MarkLogic perspective, GDPR is a great opportunity to demonstrate the value of an operational DataHub that stores data and metadata together to support the robust data governance required by the regulation. Concrete use cases deal, for example, with consent management in a customer 360 hub, or with the right to be forgotten, by managing data lineage from the golden record back to the source dataset records.
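As an illustration, here is a minimal sketch (MarkLogic Server-Side JavaScript, with hypothetical property names) of an envelope document keeping the business data and its governance metadata, consent flags and lineage back to the source record, together in one place:

```javascript
'use strict';

// Hypothetical customer-360 envelope: the business entity and its
// governance metadata (consent, lineage to the source record) are
// stored together in a single document.
const envelope = {
  envelope: {
    headers: {
      sources: [{ system: 'CRM', recordId: 'C-4711' }],       // lineage back to the source dataset
      consent: { marketing: false, updatedOn: '2018-05-25' }   // consent state for GDPR
    },
    instance: { firstName: 'Jane', lastName: 'Doe', country: 'FR' }
  }
};

xdmp.documentInsert('/customer/C-4711.json', envelope,
  { collections: ['customer-360'] });
```

Because the lineage sits right next to the data, a right-to-be-forgotten request can be traced from the golden record back to the contributing source records.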

In 2018, I was also involved in an observe-the-business SmartDataHub project for a client, where we massively leveraged the MarkLogic Optic API to perform aggregates on ERP data in order to create KPIs and dashboards on parts purchases and optimise the supplier portfolio. I should write a post on this story someday.
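To give a flavour of it, here is a minimal Optic sketch (Server-Side JavaScript) against a hypothetical TDE view of purchase lines; the schema, view and column names are illustrative, not the client's actual model:

```javascript
'use strict';
const op = require('/MarkLogic/optic');

// Spend and order volume per supplier, from a hypothetical 'erp'/'purchase-lines' view
const kpis = op.fromView('erp', 'purchase-lines')
  .groupBy(op.col('supplier'), [
    op.count('orderCount', 'orderId'),   // number of purchase lines per supplier
    op.sum('totalSpend', 'amount')       // total amount purchased per supplier
  ])
  .orderBy(op.desc('totalSpend'))
  .result();

kpis;
```

Aggregates of this kind can feed KPI dashboards directly, without moving the ERP data out of the hub.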

2018 was also a year of continuity for me: continuity in Manufacturing (as in 2017), but most of all in Digital Continuity, which matches business challenges not only in Manufacturing but in all the other industries as well:

  • We can start with the two-part blog post on "Semantics to increase synergies and move towards the Industry 4.0 Digital Thread":
    • Part 1: mainly on effectivity management
    • Part 2: product structure management and applied effectivity




Temporal pattern
  • During the year I also spent some time improving the design of the Product Testing DataHub for Certification/Safety/Fraud management - Time meets semantics. The objective of the solution is to identify periods of time matching a specific temporal pattern (combined with geospatial, structured and unstructured data conditions) in "real time", so that engineers know where and when they must look in detail; a minimal sketch of such a combined query follows below. The solution makes the most of MarkLogic capabilities to deliver a service which is quite unique given the overall workload spent on it (a few tens of man-days).
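The sketch below (Server-Side JavaScript, with hypothetical property names and the relevant range and geospatial indexes assumed) shows how a time window, a geospatial constraint and a free-text condition can be combined in a single query:

```javascript
'use strict';

// Hypothetical search over test-result documents: the measurement window
// must overlap a period of interest, the test site must fall inside a
// given area, and the attached report must mention a keyword.
// Range and geospatial indexes on 'start', 'end' and 'location' are assumed.
const hits = cts.search(cts.andQuery([
  // temporal overlap: starts before the window ends...
  cts.jsonPropertyRangeQuery('start', '<=', xs.dateTime('2018-06-30T23:59:59')),
  // ...and ends after the window starts
  cts.jsonPropertyRangeQuery('end', '>=', xs.dateTime('2018-06-01T00:00:00')),
  // geospatial: test site within a circle around a point of interest
  cts.jsonPropertyGeospatialQuery('location',
    cts.circle(50, cts.point(48.85, 2.35))),
  // unstructured: keyword found in the attached report
  cts.wordQuery('overheating')
]));

hits;
```

The real temporal pattern matching is richer than a single overlap test, but the principle stays the same: one composable query across temporal, geospatial, structured and unstructured conditions.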
These past three years spent working with new clients and partners in Manufacturing (mainly automotive, aeronautics and transportation) gave me the opportunity to better understand, and apply, how MarkLogic and data can make Digital Continuity happen. I tried to summarise it in my last post of the year, which is probably the consolidation of what I learnt and experimented with during these years: Welcome to the (parallel) dimensions of the Digital Continuity.


And from a hobby perspective,

For me, 2018 is the year we'll say, "The personal computer is dead, long live personal computing!", another post from the beginning of the year. Shadow, a French startup with an amazing piece of technology, released its personal computer hosted in the cloud. Cocorico!

  • a GPU with GTX 1080 specs
  • a Xeon with 8 threads
  • 1 Gb/s download
  • in the cloud, with the UI streamed to the local screen

...everything needed to play, but also to use TensorFlow on GPU or Unity3D with high performance.


And from a learning perspective, 

I've spent a good number of hours on the Deep Learning Specialization from Andrew Ng, great learning material combining theory and practice, available on Coursera, and I finished the year with the AWS Solutions Architect - Associate Qwiklabs Quest.


What's next?

  • The year will start with some new courses from https://deeplearning.mit.edu.
  • Cloud and vertical solutions will be my focus for the year
  • and ML:
    • MarkLogic
    • Machine Learning
  • And more concepts and PoCs...



