From Scrapify to Apify
12 October 2021
Daltix and Apify outline the story of a calculated risk that turned out to be a big payoff.
From Scrapify to Apify: how Daltix saved 90% on web scraping costs.
This success story could be called “Why and how you should move your scrapers from Python to JavaScript”, “How to reduce your Amazon EC2 costs by 92% and save your team 4 hours of work a day”, or “How to migrate your scrapers using Apify SDK”. But whatever the best title would be, you should know this success story is really about taking calculated risks, having a team that’s willing to relearn and reinvent, and staying adaptive in a fast-paced market. Let’s see how a company obsessed with providing quality data to retail enterprises managed to make the big move from Scrapy to Apify, from Python to JavaScript – all within one year – and what the payoffs of that decision have been.
Read the full case study (in English) here.
The combination of JavaScript, Node and Apify gave us all the pieces we needed to address every one of our existing challenges, future-proof our platform and resume scaling up our web collection activities.
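For readers curious what that stack looks like in practice, here is a minimal, hypothetical sketch of a crawler built on the JavaScript Apify SDK as it existed around 2021; the URL and extracted fields are placeholders for illustration, not anything from Daltix's actual scrapers.

```javascript
// Minimal Apify SDK (2021-era) crawler, for illustration only.
// The URL and the extracted fields below are placeholders.
const Apify = require('apify');

Apify.main(async () => {
    // Queue the pages to visit.
    const requestQueue = await Apify.openRequestQueue();
    await requestQueue.addRequest({ url: 'https://example.com/products' });

    // CheerioCrawler fetches each page and hands it over as a parsed DOM ($).
    const crawler = new Apify.CheerioCrawler({
        requestQueue,
        handlePageFunction: async ({ request, $ }) => {
            // Push one result item per page into the default dataset.
            await Apify.pushData({
                url: request.url,
                title: $('title').text().trim(),
            });
        },
    });

    await crawler.run();
});
```

The shape of such a crawler, a request queue plus a per-page handler, maps fairly directly onto a Scrapy spider's start URLs and parse callback, which is the kind of one-to-one correspondence a migration like this can lean on.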
Related resources.

Daltix introduces its Data Quality Indicators to ensure data transparency
If your data provider doesn’t give you a clear outline of how they test their own data for quality, you should get suspicious. To deliver data you can trust, we’ve developed the Daltix Data Quality Indicators. Read about what our DQIs are and why they’re good for your data here!

Download the Daltix Summary & Data Span Matrix
As we expand across Europe, here is a downloadable matrix of the data and features Daltix offers, the countries we collect in, and the retailers we cover.

Daltix Data Architecture: Data Access
In the final blog post of the data architecture series, we’ll take a look at how Daltix provides data access to everyone who needs it, in a way that suits them. To answer that, we’ll examine our two data setups, share our experiences with them, and explain how they fit into our data architecture.