New method to retrieve comps with data scraper

Hi Seth,

I’m new to the land investing world and I really appreciate all the knowledge you have to share. You and the community have really helped me hit the ground running.

I’m not sure if you are familiar with website scrapers, but I think I found a good method to download a massive number of comps from LandWatch. If you have a minute, I can show you so you can share with others.


@tmillen welcome to the forum!

Sounds interesting. I’m familiar with website scrapers, but I can’t say I use them much myself. Feel free to share what you know with the community! I’m sure everyone would love to hear more.

@retipsterseth Sounds good!

Ok, I saw the super helpful REtipster tutorial on pulling comps from Redfin, but for whatever reason, I wasn't given that option when I ran a query. Maybe Redfin took the option down, or maybe it's an issue with my browser. Not sure.

Anyway! I looked for an alternate way to pull data from websites, and I came across a Google Chrome extension that blew my mind!

The extension (linked below) allows you to scrape entire lists from any website (Zillow, Redfin, LandWatch, etc.) with no limitations. You can literally pull 10,000 lines of data if you want. It’s also incredibly easy to use because it has AI that recognizes webpage formatting. Before finding this option, I tried 3 other scrapers and this one was by far the easiest and most intuitive.

All you need to do is go to your website of choice, put in your query filter, and then run the scraper. I know that some people (like myself) are visual learners, so let me know if you need me to show you.

There’s just one final step. Once you get the data into Excel, you just need to write a basic Excel formula to grab the Acres out of a text field.

Link for “Instant Data Scraper”:

Full disclosure: I have no affiliation whatsoever with this. I just thought it was super helpful to me and thought I should share.


@tmillen that sounds awesome! I’m gonna download it now and try to give it a whirl. Maybe I’ll make a tutorial video about it (if I can get it to work).

As for Redfin, they only make that functionality available in certain markets (something I realized after I originally made that video). I’m not sure why or what the rationale is, but that’s always been an issue with how it works. It must be that you were looking in one of the markets where the data isn’t downloadable like in the video. Sorry about that.

Either way… doesn’t sound like it matters all that much if you found an alternative. :wink:


@retipsterseth Sounds good, Seth!

If you run into any issues, let me know. It took me a little while (probably half an hour) to work out some kinks but, once I figured it out, it saved me a ton of time!

To mine the number of acres from the text field in the Excel download, I used the following formula: =VALUE(LEFT(L1,FIND(" ",L1)-1)) → cell L1 is where I had my Acre text field.
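For anyone who'd rather do this step outside of Excel, here's a minimal Python sketch of the same idea: grab everything before the first space and convert it to a number. The sample strings are hypothetical examples of the kind of acreage text a Redfin download might contain.

```python
def acres_from_text(text: str) -> float:
    # Mimics =VALUE(LEFT(L1,FIND(" ",L1)-1)):
    # take everything before the first space, convert to a number.
    return float(text.split(" ", 1)[0])

# Hypothetical examples of the exported acreage text:
print(acres_from_text("2.5 Acres"))  # → 2.5
print(acres_from_text("40 Acres"))   # → 40.0
```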

Thanks for the tip on Redfin. I will use that going forward in different markets for sure.




I forgot to mention that the formula I used was for mining data out of Redfin's download specifically. Other downloads might need slight tweaks to the formula. I found that Redfin was the easiest to work with, closely followed by LandWatch. In theory, though, you can use this on any site.

@tmillen sounds great!!!


@tmillen so, once you have that extension installed on chrome, is it just a matter of filtering the property search however you want it in Redfin, opening the scraper, and downloading the data from that page as a csv file? I’m not sure if there are any other settings I should be tweaking or if it’s really that simple.

I can see how the downloaded data can be useful for this purpose, for sure.

I tried this with Zillow but didn’t get quite as much useful data (seemed to be missing the acreage portion of each property, which is a crucial part of the calculation).

@retipsterseth Exactly! Once you get your filter set up, you open the scraper on your Chrome browser (I have mine pinned to the top right corner). It looks like a Pokeball!

If your search filter yields multiple pages of data, you need to tell the scraper where the "next" button is on the page. Just click "Locate 'Next' button" in the scraper prompt, then click the "next" arrow on whatever site you're on, and the arrow should highlight green (see screenshot below; I used LandWatch as an example).

Once you've highlighted the "Next" button, you just need to click "Start Crawl". Crawling enables the scraper to move from page to page to extract the data. You can end the crawl at any time, and sometimes the crawl will not stop on its own. If you see that the number of extracted rows is no longer increasing, you've already extracted all available data, and you can stop the crawl by clicking the button in the top left corner.

The scraper will only yield results as good as the data published by the site. I found that Zillow does not consistently publish acreage for sold properties (at least in the counties I searched). I’ve had better luck with Redfin and LandWatch.

In case you really want to use Zillow, I believe its export puts the acreage into a field called "list-card-details".
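Since that field is free text rather than a clean number, a space-based formula won't always work on it. Here's a small Python sketch of one way to pull an acreage figure out of a details field with a regular expression. The column name comes from the post above; the sample strings are made-up illustrations of the format, not actual Zillow output.

```python
import re

def acres_from_details(details: str):
    # Pull an acreage figure out of a free-text details field
    # (e.g. the "list-card-details" column in a Zillow export).
    # Returns None when no acreage is listed.
    m = re.search(r"([\d.,]+)\s*acres?\b", details, flags=re.IGNORECASE)
    return float(m.group(1).replace(",", "")) if m else None

# Hypothetical strings illustrating the format:
print(acres_from_details("5.2 acres lot - Lot / Land for sale"))  # → 5.2
print(acres_from_details("3 bds, 2 ba - House for sale"))         # → None
```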


@tmillen I use this one, same idea:
comes with a Chrome extension

There are quite a few data scrapers on the market now — the days of needing to pay a VA to scrape website data are over.


@tmillen Thank you for mentioning this great tool. Just curious, what were the other tools you tried that weren't as good?


@mike-i I just looked around the Chrome store and tried a few out. It wasn’t that they were necessarily bad. I just couldn’t easily figure them out. I was using NoCoding Data Scraper for a while but I found it to be too complicated. Someone with stronger technical skills may prefer it.


The one Sean mentioned has very good reviews as well and is also worth checking out.


@tmillen Hi! Two things are clear to me:

  1. this can be a real game changer
  2. you have definitely mastered it

So, if @retipsterseth likes the idea and, of course, you're available to be there and share your experience, why not go through a practical exercise during the open office hours?

Whatever is decided for the open office hours, thanks for sharing here on the Forum. I personally appreciate both the spirit and the willingness to help other fellow land investors!

@tmillen @retipsterseth @arturo Happy 2-2-22!

How can we use this awesome tool to find the Realtors with the most sold listings in a zip code?
Thanks so much for sharing, Todd & Seth!
Have a Terrific Tuesday!


@arturo Absolutely! I’m happy to do a walkthrough.


@tmillen hopefully the "Boss" will read this and see if it's feasible :grin:


Hi @tmillen, thanks for posting the web scraper recommendation. I haven't used the specific one you mentioned yet, but I plan to give it a try. I previously tinkered with some comparable scraping tools for another purpose within the land niche: scraping parcel data for properties with delinquent property taxes from counties that post such lists online in a format that can't otherwise be easily opened and manipulated in Excel. Just wanted to throw that out there in case you or others following this thread come across a county where obtaining a list this way might be helpful.


@thao-phan I think you can use this tool for so many different purposes. In general, if you can find the data on a web page, this tool should be able to extract it. I’m not nearly as familiar with real estate websites as Seth so he can better comment on the “how to”.

@dl7573 That's an extremely useful application of the tool. Thanks for sharing. It could save people a lot of time in counties that post those lists online.