External Data Integration Services, Explorium

Inbar Shirizly

Data Science Fellows June 2020 Cohort


The project goal was to create a stable and reliable signal from websites using a "dynamic scraping" method. After creating the signals, the task was to integrate the code with the platform and test it. The last part was to create a score for each output that evaluates the accuracy of the response compared to the actual customer query.
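As a minimal sketch of the signal-extraction step (the actual platform code is not shown here, and the `data-signal` attribute and field names are hypothetical), extracting named signals from a scraped page might look like this:

```python
from html.parser import HTMLParser


class SignalParser(HTMLParser):
    """Collects text from elements tagged with a hypothetical
    'data-signal' attribute, e.g. <span data-signal="company_size">."""

    def __init__(self):
        super().__init__()
        self._current = None   # name of the signal currently open
        self.signals = {}      # signal name -> extracted text

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "data-signal" in attrs:
            self._current = attrs["data-signal"]

    def handle_data(self, data):
        if self._current and data.strip():
            self.signals[self._current] = data.strip()

    def handle_endtag(self, tag):
        self._current = None


def extract_signals(html: str) -> dict:
    """Parse rendered HTML and return all tagged signals."""
    parser = SignalParser()
    parser.feed(html)
    return parser.signals
```

For example, `extract_signals('<span data-signal="company_size">51-200</span>')` returns `{"company_size": "51-200"}`. In practice the page would first be rendered by the dynamic-scraping layer; this sketch only covers the extraction that follows.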

Challenges (at least two)

  1. Understanding how to integrate with a real-world problem, and deploying with Jenkins
  2. Evaluating the confidence score of the result using several logics
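One way to combine several matching logics into a single confidence score (a sketch only; the specific logics and weights here are illustrative, not the ones used in the project) is a weighted blend of exact match, token overlap, and fuzzy string similarity:

```python
from difflib import SequenceMatcher


def confidence(query: str, result: str) -> float:
    """Blend several matching logics into one confidence score in [0, 1].
    Weights are illustrative assumptions, not the project's actual values."""
    q, r = query.lower().strip(), result.lower().strip()

    exact = 1.0 if q == r else 0.0                 # logic 1: exact match
    q_tok, r_tok = set(q.split()), set(r.split())
    union = q_tok | r_tok
    overlap = len(q_tok & r_tok) / len(union) if union else 0.0  # logic 2: token Jaccard
    fuzzy = SequenceMatcher(None, q, r).ratio()    # logic 3: fuzzy character ratio

    return 0.5 * exact + 0.25 * overlap + 0.25 * fuzzy
```

A perfect match scores 1.0, while unrelated strings score near 0, giving downstream consumers a graded measure of how well the scraped response fits the customer query.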

Achievements (according to KPIs)

  1. Productization – Done
  2. Evaluation – Partially done

Future project development 

Further development would add more sophisticated methods to evaluate the data and create solid tests that verify the retrieved data is stable.
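One simple form such a stability test could take (an illustrative sketch; the required field list is a hypothetical schema) is a regression-style check that two retrievals for the same query agree on schema and populate every required field:

```python
REQUIRED_FIELDS = {"company_name", "company_size"}  # hypothetical schema


def is_stable(run_a: dict, run_b: dict) -> bool:
    """Return True when two retrievals of the same query agree on schema
    and every required field is present and non-empty in both runs."""
    if set(run_a) != set(run_b):
        return False                 # schema drifted between runs
    for field in REQUIRED_FIELDS:
        if not run_a.get(field) or not run_b.get(field):
            return False             # required field missing or empty
    return True
```

Running such checks on a schedule would flag when a target website changes its layout and the dynamic scraper starts returning incomplete or shifted signals.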
