The challenge posed to our team was to develop a resource-light solution for scraping publicly available websites: to gather, translate, and relay data from an extremely high volume of events across multiple sources with the lowest possible latency, as this data feeds the risk management engines used by our client’s customers.
Whilst our client employs over 400 developers, they recognised that EC has deep domain and technical expertise, and that this was an extremely complex project given that the sources being scraped have put highly sophisticated mechanisms in place to detect and prevent robotic activity.
Our first engagement was to deliver a proof of concept (POC) that validated the technical feasibility of the approach. The next phase was to architect a resilient, scalable platform capable of scraping, verifying, and translating thousands of events per second in real time.
We have subsequently been engaged for several years on a long-term basis to build and continuously improve the platform. We have worked closely with our client on some of their own systems, and the trust and reputation we have established has led to engagements with other areas of their business.
Whilst we continue to develop the system and integrate it with our client’s own platforms, it is now used by the majority of their clients. As a result, we have been engaged to provide 24/7 support alongside an ongoing, long-term programme of new features and improvements.