Would you like to set up a remote desktop with Screaming Frog to crawl huge websites (1 million+ pages)?
Do you only need your crawler to run once a month (a few hours or days at a time)?
Do you have some basic technical skills and want to discover the power of the Cloud?
An Ubuntu remote desktop on an OVH cloud instance offers a good price-to-performance solution (about €1.50 / $2 per day, or half that price with a monthly subscription).
What we will do to get an army of crawlers:
- Open an OVH account or login
- Create a new Cloud Project
- Create a Server (an instance)
- Specify in advance that we want Docker to be installed (it will make everything super simple to set up)
- Install a Docker container containing Ubuntu + A remote Desktop based on NoVNC
- Connect to Ubuntu with Chrome or any browser in one click 🙂
- Install Screaming Frog with 2 commands
- Create a snapshot in OpenStack (= OVH)
- Spin up as many servers containing Screaming Frog as you want, in just ONE click
Set up a new Cloud Instance
- Go here and create an account / login: https://www.ovh.ie/public-cloud/instances/
- Then here: https://www.ovh.com/manager/cloud/index.html
- Order > Cloud Project > fill in the forms
- In your project > Infrastructure > You can add a new server in “Actions” > Add Server
- Choose the 7 GB RAM & 100 GB SSD instance for this test.
- You will need about 60 GB of disk space per 1 million URLs crawled.
- Set up your SSH key (Google is your best friend for help; the steps are OS-specific); see the example just after this list.
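If you have never generated an SSH key, a typical sequence on Linux or macOS looks like the one below (the default key path ~/.ssh/id_rsa is an assumption; adapt it to your own setup):
- ssh-keygen -t rsa -b 4096
- cat ~/.ssh/id_rsa.pub
Paste the public key printed by the second command into the SSH key field of the OVH instance creation form.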
Set up the server
- Connect to the server with a terminal over SSH
- ssh user@IP-of-your-server (replace "user" with your instance's default user, typically "ubuntu" on OVH Ubuntu images)
- Then copy and paste each line one by one:
- sudo apt-get update
- sudo apt-get upgrade
- sudo docker run -it --rm -p 6080:80 -p 5900:5900 -e VNC_PASSWORD=MyPassWordToReplaceByWhatYouWant dorowu/ubuntu-desktop-lxde-vnc
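Note that this command runs the container in the foreground, so the remote desktop stops if you close that SSH session. If you prefer it to keep running in the background and restart with the server, a possible variant (the container name "remote-desktop" is just an example, not part of the original setup) is:
- sudo docker run -d --name remote-desktop --restart unless-stopped -p 6080:80 -p 5900:5900 -e VNC_PASSWORD=MyPassWordToReplaceByWhatYouWant dorowu/ubuntu-desktop-lxde-vnc
You can stop it later with sudo docker stop remote-desktop.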
Set up Screaming Frog
- Connect to http://IP-OF-YOUR-SERVER:6080/ (the container serves plain HTTP on port 6080) with the password you used for “VNC_PASSWORD=”
- Open a terminal in Ubuntu (in the NoVNC session – icon in the bottom left of the Ubuntu desktop)
- Then copy and paste each line one by one:
- sudo apt-get install screen wget
- wget https://download.screamingfrog.co.uk/products/seo-spider/screamingfrogseospider_9.0_all.deb
- sudo dpkg -i screamingfrogseospider_9.0_all.deb
- sudo apt-get -f install (this installs any dependencies that dpkg reported as missing)
- Screaming Frog is now installed 🙂
- You can launch it here:
- Bottom-left icon > Internet > Screaming Frog SEO Spider
- To crawl huge websites, you will have to set the storage mode to disk:
- Screaming Frog > Configuration > System > Storage
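Before launching a very large crawl, it is worth double-checking that the instance really has the disk space and RAM you expect. Two standard Linux commands (nothing specific to this guide) you can run in the Ubuntu terminal:
- df -h (free disk space per filesystem)
- free -m (available RAM in MB)
You may also want to raise Screaming Frog’s RAM allocation for multi-million-URL crawls; check the memory allocation section of the Screaming Frog documentation for the method matching your version.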
Next step: create a snapshot of this instance in OVH (to be continued if you liked this article!)
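As a preview of that next step: OVH Public Cloud is based on OpenStack, so once you have the OpenStack CLI installed and your OVH OpenRC credentials loaded, a snapshot of the instance can typically be created with a command like the one below (the snapshot name "screamingfrog-ready" and the instance name are placeholders, not values from this guide):
- openstack server image create --name screamingfrog-ready your-instance-name-or-id
New instances can then be launched from that image, each one already containing the Docker remote desktop and Screaming Frog.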
