How to Scrape Google Search Results for Free [2024 Edition]

Shehriar Awan
January 17, 2024
14 min read

Ever needed Google SERP data for hundreds of keywords at once? Traditional SEO tools can’t do it, and it's frustrating.

Collecting data from Google search results for lots of keywords can be a headache. Doing it manually might take eons. And traditional SERP analysis tools are pretty much useless for this.

In this article, we’ll learn how to extract all Google search results from hundreds of keywords for free.

But is this useful only for SEOs? Let’s explore some other vital use cases for scraping Google SERPs.

Why scrape Google search results?

Google dominates the search engine market with almost 92% market share. Every month, the world’s #1 search engine receives more than 84 billion visits.

Google is the world's most visited website

This makes the most visited website in the world a goldmine of data for various use cases, including:

  1. Market Research: identify competitor presence, market gaps, and consumer sentiment.
  2. SEO (Search Engine Optimization): build competitor strategies, spot keyword opportunities, and track content performance.
  3. Academic Research: collect everything Google has indexed on a given topic.
  4. Journalism: find articles, social media posts, and any other content Google surfaces around a story.
  5. Fact-Checking: skip the manual searching and get to the facts faster.

But does Google allow web scraping? Let’s find out.

Google does prohibit scraping in its terms of service, but it has never seriously tried to enforce the ban. As a Stack Overflow user points out, there have been no lawsuits so far:

Google did not file a lawsuit against Microsoft

It’s not just Bing; most SEO tools scrape Google. As Semrush mentions in their knowledge base:

Semrush's data collection source

Why is Google silent about it? The short and sweet answer: Google doesn’t care. As Redditor LopsidedNinja puts it:

Google does not care

Google itself scrapes data from all over the web, and that’s possible because websites allow crawling in their robots.txt files. As long as the scraping is harmless, it’s tolerated.

Google search results are also publicly accessible information. And since the LinkedIn vs hiQ Labs judgment, it is established that scraping publicly available data is legal.

LinkedIn vs hiQ Labs judgment

Hence, scraping Google search results is legal. But you can still get in trouble for copyright infringement if you republish copyrighted content without consent.

But Google has tons of APIs, so why scrape at all if we can use an official API?

Is there any official Google search API?

Yes, Google does have an API for collecting and displaying search results: the Custom Search JSON API. But like any other official API, it has limitations.

  1. It’s extremely expensive
  2. You don’t always get the actual search results
  3. The limits and pricing keep changing

Let’s elaborate on each limitation.

Firstly, the official Google API is extremely expensive.

You can only perform 100 requests per day for free. On the paid tier, the API allows at most 10k requests per day, and you pay $5 per 1,000 requests.

Google Custom Search API pricing
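For reference, here’s roughly what a call to the Custom Search JSON API looks like in Python; the API key and the Programmable Search Engine ID (cx) are placeholders you’d have to create yourself:

```python
# Rough sketch of a Custom Search JSON API call (placeholders for key and cx).
# Each request returns at most 10 results, so deeper SERPs require paging via `start`.
import requests

API_KEY = "YOUR_API_KEY"          # placeholder
SEARCH_ENGINE_ID = "YOUR_CX_ID"   # placeholder: Programmable Search Engine ID

resp = requests.get(
    "https://www.googleapis.com/customsearch/v1",
    params={
        "key": API_KEY,
        "cx": SEARCH_ENGINE_ID,
        "q": "scraping tiktok ads",
        "num": 10,    # max results per request
        "start": 1,   # pagination offset (1, 11, 21, ...)
    },
    timeout=10,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["title"], item["link"])
```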

Secondly, the data isn’t that useful. You won’t get real-time search results, and the API’s data often differs from the actual SERP. That makes it a poor fit for commercial use, especially SEO.

Google Custom Search API is useless

Lastly, even if the API fits your requirements today, there’s no guarantee it will tomorrow: Google keeps updating the limits and pricing.

Google keeps updating API pricing and limitations

Besides Google’s official API, we can also use third-party SERP APIs. There are plenty of them available. But again, they’re expensive and require coding.

So why not just code a scraper without any API?

If you’re a good programmer, you can scrape Google search using Python or any other programming language. But it’s hectic, complex, and not sustainable.

Google SERP webpage structure

Look at Google’s web page structure. Even if you manage to parse this HTML using BeautifulSoup or lxml, Google keeps updating the page structure.
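Just to illustrate how fragile a DIY approach is, here’s a minimal sketch using requests and BeautifulSoup; the div.g and h3 selectors are assumptions based on Google’s current markup and break whenever Google reshuffles the page:

```python
# Minimal DIY sketch: fetch a SERP and parse result titles with BeautifulSoup.
# The selectors below are assumptions and tend to break when Google changes its HTML.
import requests
from bs4 import BeautifulSoup

headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
resp = requests.get(
    "https://www.google.com/search",
    params={"q": "scraping tiktok ads", "hl": "en"},
    headers=headers,
    timeout=10,
)

soup = BeautifulSoup(resp.text, "html.parser")
for result in soup.select("div.g"):        # fragile: class names change often
    title = result.select_one("h3")
    link = result.select_one("a")
    if title and link:
        print(title.get_text(strip=True), link.get("href"))

# In practice Google quickly answers with a consent page, a CAPTCHA, or a 429
# once you send more than a handful of requests from the same IP.
```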

If you use Selenium, you’ll have to deal with CAPTCHAs. But that’s not the worst part.

Google will simply block your IP address. And using proxies can get extremely expensive too, as John shares from his experience in this Stack Overflow post:

Problems with custom scraper

So for peeps like me who don’t have a fortune to spend and don’t like coding, both of these solutions are useless. The only option left is a no-code tool.

How to scrape Google search results without coding

There are a lot of no-code web scraping tools out there. But for this article, I’ll be using the one I believe is the best on the market: Google Search Scraper by Lobstr.io.

Google Search Scraper by Lobstr.io

Features

  1. 13 data attributes, including People Also Ask, related searches, and rich snippets
  2. All countries, regions, and languages
  3. Ads and omitted results included
  4. Super fast: scrapes 130+ results per minute
  5. Cloud-based, with a handy scheduling feature
  6. Custom exports, including Google Sheets, Amazon S3, SFTP, and webhooks
  7. Developer-ready API

Pricing

Lobstr offers a transparent and affordable pricing range that suits every pocket.

Lobstr.io is affordable

  1. Free plan: Free forever, 58k results per month
  2. Premium plan: €0.07 per 1000 results
  3. Business plan: €0.04 per 1000 results
  4. Enterprise plan: €0.03 per 1000 results
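To put those numbers in perspective, here’s a rough back-of-the-envelope comparison with the Custom Search JSON API pricing quoted earlier, assuming roughly 10 results per API request:

```python
# Back-of-the-envelope cost comparison for 100,000 search results,
# using the prices quoted in this article (assumes ~10 results per API request).
results_needed = 100_000

google_requests = results_needed / 10            # API returns at most 10 results per request
google_cost_usd = google_requests / 1_000 * 5    # $5 per 1,000 requests

lobstr_cost_eur = results_needed / 1_000 * 0.07  # Premium plan: €0.07 per 1,000 results

print(f"Custom Search API: ${google_cost_usd:.2f}")  # -> $50.00
print(f"Lobstr.io Premium: €{lobstr_cost_eur:.2f}")  # -> €7.00
```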

Let’s learn how to scrape Google search results using Lobstr.io.

Step by step tutorial to scrape Google SERP using Lobstr.io

We’ll scrape Google search results within a few minutes in 6 simple steps.

  1. Get Google SERP URL
  2. Create new squid
  3. Add tasks
  4. Adjust behavior
  5. Launch
  6. Enjoy

Let’s go! 🏃

1. Get Google SERP URL

To scrape Google SERP data, all you need is a SERP URL. How do we get that? Let me kill two birds with one stone: I’ll track how our latest article is performing in different regions of the US.

I recently published how to scrape TikTok ads without coding. If you were looking for a perfect TikTok ads scraper, do check it out.

We’ll track its SERP position in New York, California, and Texas. Let’s enter our search query in Google and copy the URL.

Getting SERP URL

Copy the URL and head over to Lobstr.io to create a new Squid. Here’s our URL: https://www.google.com/search?q=scraping+tiktok+ads
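If you’d rather build SERP URLs programmatically (handy once you have hundreds of keywords), a tiny sketch like this does the trick; only the q parameter is required, hl (interface language) is optional:

```python
# Build a Google SERP URL from a keyword instead of copying it from the browser.
from urllib.parse import urlencode

query = "scraping tiktok ads"
serp_url = "https://www.google.com/search?" + urlencode({"q": query, "hl": "en"})
print(serp_url)  # https://www.google.com/search?q=scraping+tiktok+ads&hl=en
```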

2. Create new squid

To launch the Google Search Scraper, click the Create new Squid button on the dashboard. Then search for “Google Search” and select Google Search Scraper.

Create a new Google Search Scraper Squid

Tada! ✨Your scraper is ready.

3. Add tasks

Next, we’ll add tasks. In this case, a task is simply a SERP URL. Paste the URL you copied here. You can also add URLs in bulk using the upload file option.

Paste SERP URL to the Squid

You can add as many tasks as you want. Once done, click the Save button to proceed to settings.
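For hundreds of keywords, you can generate the upload file with a few lines of code. Note that the plain-text, one-URL-per-line format below is an assumption; check what the upload dialog actually accepts:

```python
# Generate a bulk upload file: one SERP URL per keyword.
# The keyword list and the output filename are just examples.
from urllib.parse import urlencode

keywords = ["scraping tiktok ads", "scrape tiktok ads free", "tiktok ads scraper"]

with open("serp_urls.txt", "w", encoding="utf-8") as f:
    for kw in keywords:
        f.write("https://www.google.com/search?" + urlencode({"q": kw}) + "\n")
```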

4. Adjust behavior

Let’s adjust our scraper’s behavior using settings. We have 3 options to adjust our Squid’s behavior.

  1. Basic settings
  2. Advanced settings
  3. Notifications

Basic Settings

Firstly, we have basic settings. You can specify how many pages you want to scrape using the Max Pages option.

Set maximum number of pages to scrape

Leave it blank to scrape all available search result pages. You can toggle Mobile Results to scrape mobile search results.

Next is – When to end run. Select end run once no credit left to stop scraping when you’ve consumed your daily credits. This is best for getting fresh data every time you initiate a run.

Choose when to end the run

If you’ve got hundreds of URLs to scrape, select end run once all tasks consumed. This will pause the run once you’ve used all the credits and resume it the next day.

I’ve explained how this works in my how to scrape TikTok ads article. Do check it out.

Since we only have 1 URL and we need fresh data on every launch, I’ll choose the first option.

Advanced settings

With that, we move to the advanced settings. The first option is Results per page, which lets you set the number of results displayed per page. It’s not that useful in our case.

The second option is where it gets interesting. This is where we choose the country, language, and specific location of our SERP data.

I need to know what position my article is ranking at in New York. Let’s adjust our boi.

Adjusting crawler to fetch NY SERP data

Next is concurrency, i.e. the number of bots deployed per run. You can deploy multiple bots for faster data extraction.

Concurrency means number of bots deployed

Check the pricing page to learn how many bots your plan includes. For example, with the Premium plan, you can either create 10 Squids or deploy 10 bots on a single Squid.

1 slot means 1 concurrency or squid

You can remove duplicate results using the Unique Results filter. If you prefer Excel for viewing data, toggle No Line Breaks to remove line breaks from text fields.

Once done with settings, click save and you’ll see the notifications menu.

Notifications

You can opt to receive email notifications upon success or failure of a run.

Set notifications

You don’t have to open your dashboard to know how your last run ended. Select On success to receive an email notification when a run completes successfully, and On error to be notified when a run fails.

With that, we’re ready to launch our scraper.

5. Launch

This is the last configuration step. Here we set up our launch sequence 🚀 and boom!!! Google Search Scraper starts collecting data. There are 2 ways to launch the Squid:

  1. Manually
  2. Repeatedly

Choose how to launch the crawler

If you want to launch the scraper instantly, select Manually and click the Save & Extract button. The next thing you’ll see is a live console for monitoring progress and real-time results.

Google Search Scraper live console

But what if I want to track the SERP position daily or weekly? I obviously want to do that. That’s where the schedule feature will help us.

Schedule

You can schedule your Squid to run automatically at the time and frequency of your choice. To do that, select Repeatedly and set a schedule.

Schedule launch

I’m scheduling my Squid to run every Monday at 12PM. This way I can track weekly changes in rankings.

6. Enjoy

Since I have to track 3 states, I created 2 more Squids to track Texas and California. They’ll all run at the scheduled time every Monday.

List of squids

But I can’t wait till Monday, so I launched all 3 Squids manually. Let’s download the .csv files and see the results.

Starting with California: our scraper collected 352 results, and we rank #8 in the SERP, woohoo!!

California Google SERP data

In Texas, Google gave us 268 search results, and our boi ranks at #8, again ✨

Texas Google SERP data

And finally New York, the region we targeted specifically; 228 total results, and we rank… #6. That’s some great news.

New York Google SERP data

So the tool is awesome, but I prefer Google Sheets. Downloading a CSV file and then importing it into Sheets is a hassle. Is it possible to sync a Google Sheet to our Squid?

Delivery

All Lobstr Squids can be synced to Google Sheets, Amazon S3, SFTP, and webhooks. To do this, just click the delivery button in the console and configure your preferred service.

Set delivery

Now you don’t have to open the dashboard and download results manually. They’ll be pushed to your preferred service automatically after every run.
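If you go the webhook route, all you need on your side is an endpoint that accepts a POST request. Here’s a minimal Flask sketch; the /lobstr-webhook path and the results key in the payload are assumptions, not Lobstr’s documented schema:

```python
# Minimal webhook receiver sketch (Flask), assuming run results arrive as JSON
# at the URL you configure in the delivery settings. Payload shape is an assumption.
from flask import Flask, request

app = Flask(__name__)

@app.route("/lobstr-webhook", methods=["POST"])
def receive_results():
    payload = request.get_json(silent=True) or {}
    # Do something with the delivered rows, e.g. store or forward them.
    print(f"Received {len(payload.get('results', []))} results")
    return "", 204

if __name__ == "__main__":
    app.run(port=8000)
```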

But is it actually better than SEO tools? Let’s find out!

How is Lobstr’s Google Search Scraper better than SEO tools?

For comparison, I’m putting this boi up against 2 SEO giants, i.e. Ahrefs and Semrush. Both tools offer a rank tracking feature.

But why are they the best? Well, instead of searching for “best seo tool”, I prefer reading consumers’ responses. Here’s one X (Twitter) poll, for example:

https://twitter.com/PeterMindenhall/status/1706704940435906802

Let’s draw a side-by-side comparison first:

| Features | Lobstr.io | Semrush | Ahrefs |
| --- | --- | --- | --- |
| SERP updates | Real-time | Daily | Weekly |
| Local tracking | Yes | Yes | No |
| Accuracy complaints | None | Some | None |
| Pricing | Affordable | Moderate | Expensive |

Now let’s dive deeper. First, I’m going to use Ahrefs’ rank tracker.

Ahrefs rank tracker doesn’t support local rank tracking. You can’t see your ranking in a particular city or state.

This can often mislead you while doing competitor research or local optimization. Let’s understand it with an example.

Ahrefs position tracker

Here I added my target keyword to the rank tracker. It says we’re ranking at #10 in the US. But that seems off 🤔. We just confirmed we’re ranking #6 in New York.

Why the inaccuracy? Because Google SERP rankings can vary from city to city. We rank #8 in two states and #6 in New York, for example.

Google search uses context

For local SEO, knowing your SERP position and your top competitors in that targeted region is crucial. You can read this Ahrefs guide to learn more about local SEO.

Lobstr lets you choose the city, state, and language you’re targeting, giving you accurate rankings for local SEO.

The second issue with Ahrefs is outdated results.

As the screenshot shows, these rankings were last updated on the 9th of January, which means they’re almost a week old. Some users find that frustrating.

Ahrefs only updates once a week

Lobstr provides real-time SERP data from Google. Every time you initiate a run, the results you get are exactly what your audience would see when they search that keyword.

Another issue with Ahrefs is – it’s expensive.

Ahrefs pricing

The basic plan costs $99 per month. On top of that, you’ll have to pay an extra $100 per month for daily rank updates, and adding more keywords costs an extra $50 per month.

Ahrefs limitations

Lobstr costs you only €50 per month. You can track unlimited keywords. Plus get real-time SERP updates. No extra charges.

Let’s see how Semrush’s rank tracker performs.

Semrush allows you to target a specific city or even a specific zip code. Its data is also updated daily.

Semrush Rank Tracker

But users do complain about data inaccuracy.

Semrush Position Tracking Inaccuracies

This was further validated by other users in comments:

Semrush inaccuracies validation

One thing I really love about Semrush is their support. They’ll respond to you whether you’re on LinkedIn, X, Reddit, or any other social network. Just check out that Reddit thread above.

But the main issue is – It’s the most expensive tool.

The Pro plan costs $130 per month. You can only create 5 projects, and each project supports only 1 location. That means you can track a maximum of 5 locations.

Semrush Pricing

It also caps the total number of keywords: you can only monitor a maximum of 500 keywords on the Pro plan.

That’s way more expensive than Lobstr.io, which allows unlimited keywords on all plans. Plus, with Lobstr’s Premium plan, you can track up to 10 locations.

Slots are equivalent to Semrush projects

But Lobstr also has a limitation. Let’s discuss that too.

Limitation

While Lobstr.io is the best SERP scraper on the market, it does have one limitation. Google shows a maximum of about 500 results per search, so the scraper can only extract up to 500 results per URL.

How to bypass the max Google results limitation?

You can split the search into multiple URLs using related keywords.

  1. Use related keywords like scraping tiktok ads free, scrape tiktok ads, etc.
  2. Use Google advanced search with a custom time range to split the results by date (see the sketch after the screenshot below).

Google Custom Time Range filter
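Here’s a small sketch that generates such split URLs programmatically. The tbs=cdr:1,cd_min:...,cd_max:... parameter is what Google’s Tools > Custom range filter appends to the URL, and the date ranges below are arbitrary examples:

```python
# Split one query into several SERP URLs using Google's custom date range filter.
# The yearly/half-yearly ranges below are arbitrary examples; adjust them to your topic.
from urllib.parse import urlencode

query = "scraping tiktok ads"
ranges = [("1/1/2023", "6/30/2023"), ("7/1/2023", "12/31/2023"), ("1/1/2024", "12/31/2024")]

urls = []
for cd_min, cd_max in ranges:
    params = {"q": query, "tbs": f"cdr:1,cd_min:{cd_min},cd_max:{cd_max}"}
    urls.append("https://www.google.com/search?" + urlencode(params))

for url in urls:
    print(url)
```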

FAQs

What is SERP?

SERP stands for Search Engine Results Page. It's the page you see after you enter a query into a search engine like Google. In our context, it’s the Google search results page.

Can I scrape Google ads too?

Yes, this scraper extracts both paid and organic results. This means you can scrape Google ads too.

Can I get SEO data like Keyword Difficulty, Volume, etc?

No. Google doesn’t display search volume or keyword difficulty on the SERP; in fact, keyword difficulty is an estimate calculated by SEO tools. This tool only scrapes data that actually appears on the results page.

You do, however, get People Also Ask and Related Searches data, which makes it a great choice for keyword research.

Can I scrape Google News with this tool?

No, if you want to scrape news articles specifically, this tool is not the ideal choice. You can try the Google News Scraper.

Can I scrape Google Maps with this tool?

No, this scraper collects SERP data only. To scrape Google Maps listings, you can use the Google Maps scraper. For scraping reviews, use the Google Maps review scraper.

Can I use AI to scrape Google?

Unfortunately, no. If you want to scrape Google SERP data at scale, there’s no reliable AI solution available yet. But you can use GPT to help you code a scraper.

Do I need proxies for scraping Google SERPs?

No, with Lobstr, you don’t need proxies. We’ve got you covered. But yes, if you’re creating your own scraper, you may need proxies.

Conclusion

That’s a wrap on how to scrape Google search results without coding. You can use the Lobstr.io Google Search Scraper not only for data collection but also as an affordable, accurate SEO tool.

Go on, create a free account and give it a spin.

Shehriar Awan - Content Writer at Lobstr.io

Shehriar Awan

Self-proclaimed Head of Content @ lobstr.io. I write all those awesome how-tos and listicles, and (when they deserve it) troll our competitors.