From Scrapy to Apify.

12 October 2021

Daltix and Apify outline the story of a calculated risk that turned out to be a big payoff.

From Scrapy to Apify: how Daltix saved 90% on web scraping costs.

This success story could be called “Why and how you should move your scrapers from Python to JavaScript”, “How to reduce your Amazon EC2 costs by 92% and save your team 4 hours of work a day”, or “How to migrate your scrapers using Apify SDK”. But whatever the best title would be, you should know this success story is really about taking calculated risks, having a team that’s willing to relearn and reinvent, and staying adaptive in a fast-paced market environment. Let’s see how a company obsessed with providing quality data to enterprises active in retail managed to make that big move from Scrapy to Apify, from Python to JavaScript – all within one year – and what the payoffs from that decision are.

Read the full case study (in English) here.


The combination of JavaScript, Node and Apify gave us all the pieces we needed to address every one of our existing challenges, future-proof our platform, and resume scaling up our web collection activities.

Related resources.


Daltix Data Architecture: Data Access

In the final blog post of the data architecture series, we take a look at how Daltix provides data access to everyone who needs it, in a way that suits them. To answer that, we examine our two data setups, share our experiences, and explain how they fit into our data architecture.

Read article

What Daltix stands for.

Integrity is our superpower.

We envision and build the future of data.

Collaboration is key.