Travis Keller · Participant · July 18, 2018 at 3:26 pm · Post count: 62
To get comps, are you guys manually going to each of the land and real estate websites, typing in county and city names, then sorting them by price and yada yada yada…?
Or is there a quicker way?
I’ve put together a web scraper spreadsheet that pulls size, price, and other data from landwatch.com, sorted low to high, but it stops working after querying 5 to 10 counties.
I wonder, though: if I put together a web scraper where you type in the name of one county or city and it spits out comps for the top entries from all the real estate sites we use, would other community members get use out of that?
Or is everyone in on a big secret I’m not yet aware of and finding comps takes a trivial amount of time for each of you?
I know Jack said he has a sheet that gives the average price of houses from Zillow, Trulia, and other sites, so if it’s possible, maybe this could just be an expansion of that?
I know there are some clever people in the group, so I’m interested to hear how you all go about it. Thanks.

Kevin Farrell · Moderator · July 18, 2018 at 5:07 pm · Post count: 898
Travis – Everyone probably does this a bit differently. I get comps from landwatch.com and from Zillow. I usually check LW first and use that number in my data, then check Zillow just to see if it is significantly lower. Sometimes I have to adjust for lower comps on Zillow, but most times I just use the comps I get from LW and run with them. I will adjust the data by throwing out the lower ones from other land dealers, plus maybe some other adjustments, to pick my comp price for similar parcels. You can study this until the cows come home, but in the end you have to make a judgment call on value and what your offer will be. The thinner the data, the lower my offer will be. I found that I would much rather get more hate mail than have a bunch of offers out there that are too high for me to purchase.
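For anyone who wants to automate the "throw out the low ones" step Kevin describes, here is a rough Python sketch. The median-based 40% cutoff and the choice to return the low end of what survives are my own made-up assumptions, not Kevin's actual rule; tune them to your market.

```python
# Rough sketch of trimming suspiciously low dealer comps before
# picking a comp price. The 0.4 cutoff is an assumption, not gospel.
from statistics import median

def pick_comp_price(prices, low_cutoff=0.4):
    """Drop comps far below the median (often other land dealers
    pricing to move), then return the low end of what's left as a
    conservative comp."""
    if not prices:
        return None
    mid = median(prices)
    kept = [p for p in prices if p >= mid * low_cutoff]
    return min(kept)

comps = [14900, 3500, 12000, 15500, 13900]   # made-up listing prices
print(pick_comp_price(comps))                # 12000 (the 3500 is thrown out)
```

Returning the minimum of the surviving comps errs on the low side, which matches the "hedge your bets with low offers" advice in this thread.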
I know that Jack is a data guru and he preaches the data thing all the time. That’s great if you have all of his resources and experience. If not, run with what you have. Make the best decisions that you can with the data that you have. Hedge your bets with low offers so you can’t lose. Send out the offers and get going. This is how you learn to read and use the data.
If you need a little extra help after doing all the right things, set up a conference with Jack to review your data set. Good luck.

Darald Berger (@AggieLand) · Participant · July 19, 2018 at 9:43 am · Post count: 165
@tkells are you trying to scrape every listing on LandWatch? You said it stops working after 5-10 counties.
I had a Fiverr developer create a scraper that runs in Firefox via the iMacros plugin for LandWatch, Redfin and Zillow. I go to each site and put in the custom search criteria (land only, size, etc.) that I want the data for. Then I copy the page URL for this “custom” search and put it into the txt file the scraper uses when it runs. I can enter as many URLs in the txt file as I want to scrape. It works very well if the comps are available. Obviously you have to review the data and pull out any anomalies or data you don’t want included.
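The "saved-search URLs in a txt file" workflow Darald describes could be sketched roughly like this in Python (his actual tool is an iMacros script, so this is only an illustration). The `urls.txt` name, the dollar-sign regex, and the 2-second delay are all my assumptions.

```python
# Illustrative sketch of the txt-file-of-saved-searches workflow.
# The regex just grabs anything that looks like a $ price; real
# listing pages would need proper parsing and hand review.
import re
import time
import urllib.request

def load_urls(path):
    """One search-results URL per line; blank lines are ignored."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def scrape(urls, delay=2.0):
    """Fetch each saved-search URL and pull out price-looking numbers."""
    prices = []
    for url in urls:
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        prices += [int(p.replace(",", ""))
                   for p in re.findall(r"\$([\d,]+)", html)]
        time.sleep(delay)   # be polite; hammering a site gets scrapers blocked
    return prices
```

As Darald says, you would still review the scraped numbers by hand and pull out anomalies before using them as comps.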
With all that said, I will still look at some of the lowest comps for areas within the county to base my offer prices on. I don’t offer one price for the entire county. If comps are not available for an area, then give it an educated guess and err on the low side like @rotortech suggested above.

Travis Keller · Participant · July 19, 2018 at 12:58 pm · Post count: 62
Darald, are you also having to get a separate URL for each county/city name you want to query? Or can you just get one URL from each site and use it for as many locations as you want without having to fetch a new one? Great idea enlisting help from someone on Fiverr. That’s leverage.
As for the 5-10 counties thing, I just mean the spreadsheet works with about 5-10 entries before all of the IMPORTXML functions lock up and perpetually say ‘Loading…’. I don’t want to get too bogged down in the technical discussion on this forum (happy to PM), but unless I can find a way around it, if it’s worth it I’ll just figure out how to write a Google Apps Script to do it for me. It would behave like the plugin you’ve described, except there would be no copying of URLs: just input your parameters.
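For what it’s worth, one plausible reason a sheet full of IMPORTXML calls stalls is that Sheets fires them all at once, and the target site throttles the burst. A script can instead fetch sequentially and back off on failure. The real fix would be a Google Apps Script in JavaScript; this Python sketch just shows the retry-with-backoff shape, with `fetch` standing in for whatever HTTP call the script actually makes.

```python
# Sequential fetch with exponential backoff -- the pattern a script
# could use in place of a burst of parallel IMPORTXML calls.
# fetch() is a stand-in for the real HTTP call.
import time

def fetch_with_backoff(fetch, url, tries=3, base_delay=1.0):
    """Call fetch(url); on failure wait 1s, 2s, 4s, ... then retry.
    Re-raises the last error if every attempt fails."""
    for attempt in range(tries):
        try:
            return fetch(url)
        except Exception:
            if attempt == tries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

The same loop structure translates directly to Apps Script with `UrlFetchApp.fetch` and `Utilities.sleep`.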
The spreadsheet in its current form can be found here https://docs.google.com/spreadsheets/d/1GWwUJR5278Bc7X7myeA0C_8v0RSdaiLDjUngAD-8ECo/edit?usp=sharing
After listening to your much-appreciated input, though, Kevin, it may be more hassle than it’s worth. I have the technical ability to import and parse all that data; it’s just a matter of whether it’s worth the time and effort. I like hearing that each of you has your own technique and that the basics are just that: the basics.
Thank you, gentlemen. Open to hearing others’ opinions as well.

Darald Berger (@AggieLand) · Participant · July 19, 2018 at 2:05 pm · Post count: 165
@tkells WOW we need to talk.. LOL
Yes, I have to enter a unique URL for each custom search. Also, something to keep in mind: most sites have a maximum number of pages that they will return in a search. For example, if you search Redfin for “Texas, USA” and only limit the search by the “land” criterion, it shows over 22,000 search results. But Redfin has a maximum of 17 pages of 21 results/page (357 results) that it will return, so that limits my scraper and is sometimes why I have to use multiple URLs to capture all the comp data if the criteria are broad.
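The page-cap arithmetic Darald describes can be sketched as below. The 17-page and 21-results/page numbers come straight from his Redfin example, but the `/page-N` URL pattern is my assumption; the takeaway is the flag telling you the search is too broad and needs to be split into narrower criteria.

```python
# Sketch of planning a paginated scrape around a Redfin-style cap.
# /page-N is an assumed URL pattern, not Redfin's documented scheme.
PAGE_CAP = 17    # max pages the site will return
PER_PAGE = 21    # results per page

def plan_scrape(base_url, total_results):
    """Build the list of page URLs we can actually reach, and flag
    when the search must be narrowed to see everything."""
    reachable = min(total_results, PAGE_CAP * PER_PAGE)   # at most 357
    pages = -(-reachable // PER_PAGE)                     # ceiling division
    urls = [f"{base_url}/page-{n}" for n in range(1, pages + 1)]
    need_to_split = total_results > PAGE_CAP * PER_PAGE
    return urls, need_to_split

urls, split = plan_scrape("https://www.redfin.com/search", 22000)
print(len(urls), split)   # 17 True -- 22,000 results won't fit in 357
```

When `need_to_split` comes back true, the fix is exactly what Darald does: add limiting criteria (size, price band, sub-area) and feed the scraper several narrower URLs instead of one broad one.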
Maybe your code doesn’t have this limitation. Also, are you able to include additional limiting criteria other than location?
Again, I am a pretend geek, hence why I needed a guy on Fiverr. 🙂 He’s actually working on a Census tract/block enhancement for me that I’m excited to see the results of.