For a long time now the Scrapebox backlink checker has used Alexa data. Alexa suddenly started limiting its free data from 1,000 backlinks down to only 5; if you want more, you have to pay for Alexa Pro.
So Scrapebox has transitioned to using the MOZ API, similar to the Page Authority addon. In the video below I cover how to set up an API key, how to set up the addon, and how to use it.
The major upside here is that MOZ data is considerably better and much fresher than Alexa's. It will also pull in your accounts from the Page Authority addon, if you have any set up in there.
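For the curious, hitting the MOZ API from your own script looks roughly like this. This is a minimal sketch, assuming Moz's Links API v2 `url_metrics` endpoint; the endpoint URL, field names, and the placeholder credentials are assumptions to check against Moz's own docs for your plan:

```python
import base64
import json
import urllib.request

MOZ_ACCESS_ID = "mozscape-xxxxxxxx"   # placeholder, use your own access ID
MOZ_SECRET_KEY = "your-secret-key"    # placeholder, use your own secret

def build_request(targets):
    """Build a POST request for Moz's url_metrics endpoint
    (endpoint and auth scheme assumed; verify against Moz's docs)."""
    creds = base64.b64encode(f"{MOZ_ACCESS_ID}:{MOZ_SECRET_KEY}".encode()).decode()
    return urllib.request.Request(
        "https://lsapi.seomoz.com/v2/url_metrics",
        data=json.dumps({"targets": targets}).encode(),
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/json",
        },
    )

def backlink_metrics(targets):
    """Send the request and return the per-URL results list."""
    with urllib.request.urlopen(build_request(targets)) as resp:
        return json.load(resp)["results"]
```

Inside Scrapebox you never touch any of this, of course; the addon just asks for your access ID and secret key and does the rest.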
Neil Patel did a great write-up on using Scrapebox for whitehat SEO. You can find the full article here: Scrapebox Whitehat SEO
There are MANY ways you can use Scrapebox for whitehat SEO, and Neil covers some great methods in the article. There are of course many more uses if you just learn how to use Scrapebox and get creative, but neither he nor I can give away all the secrets of how we use it.
I can assure you, though, as part owner of a full brick-and-mortar SEO company, we use Scrapebox for many things “outside the box” of what all too many marketers stereotype Scrapebox as. Be creative, think things through, and you will find dozens if not hundreds of whitehat applications. Forget the blackhat angle: don't think of Scrapebox as a blog commenter, just pretend it doesn't even do blog commenting, and you will be able to think about it far more creatively!
You can use Scrapebox to post comments on your own blog, as well as to post content to it.
The Article Scraper plugin has all the tools needed for scraping content, spinning, quality assurance, and posting the content to your blogs.
Then you can use the comment poster in Scrapebox to post comments on your own blog. Scrapebox even created a plugin that lets you mass-post on your blog and bypass the standard commenting limit. You can post as many comments as you like, and I cover how to make them all auto-approve so that you do not need to go in and approve each comment.
You can even build links to your blog when you're done.
The Rank Tracker plugin now features a scheduler. The scheduler lets you run the Rank Tracker while you sleep, eat dinner, or at any other time, and you can schedule it to run every day, automatically.
It tracks all data and can even export the data daily to an Excel file. All you have to do is update to the latest version of the Rank Tracker; you do not need the Automator plugin.
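As a quick illustration of what you can do with those daily exports, here's a small Python sketch that reads a rank-tracking CSV and reports how far each keyword has moved. The column names (`keyword`, `date`, `rank`) are assumptions for illustration; match them to whatever your actual export uses:

```python
import csv
import io

# Illustrative export; the real Rank Tracker's columns may differ.
sample = """keyword,date,rank
blue widgets,2015-06-01,14
blue widgets,2015-06-02,11
red widgets,2015-06-01,8
red widgets,2015-06-02,9
"""

def rank_changes(csv_text):
    """Return {keyword: first_rank - last_rank}, so a positive
    number means the keyword moved up in the rankings."""
    history = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        history.setdefault(row["keyword"], []).append(int(row["rank"]))
    return {kw: ranks[0] - ranks[-1] for kw, ranks in history.items()}
```

With the sample data above, "blue widgets" improved by 3 spots while "red widgets" slipped by 1.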
The latest version of the Google Image Scraper, via the Image Grabber addon, now features multiple image-licensing options. This is helpful if you want to pull in images that carry various licenses.
Facebook also made some changes to their platform that inadvertently broke Scrapebox's ability to scrape emails from Facebook. The next update fixes that, and you can once again scrape emails from Facebook.
Ever thought about building High Authority, High PR, dofollow links with good Alexa rank and low OBL? It's an SEO dream, right? How often do you think, “oh, Scrapebox can help me do that”? Unfortunately far too few people realize that Scrapebox can do just this.
I cover the concept in this video. You can use the concepts here to build High PR, High Authority, dofollow links with good Alexa rank and low OBL, filtered against bad sites. Excellent links like this aren't going to fall into your lap; you have to look for them. I used to spend hundreds of hours doing this by hand. Scrapebox lets you filter a list and find places to build such links, all with only a few clicks. In fact, it's one of the original reasons Scrapebox was built.
Did you know Scrapebox was built for the developers' own personal SEO use, just to automate tasks they were doing by hand? Similar stuff to what I'm talking about in the video. Then they sold Scrapebox to a few friends, and it was so helpful and popular that it blew up into one of the most widely used SEO tools today.
The developers didn't want to do this stuff by hand, so they automated it. I used to do all this by hand as well, and that's why I bought Scrapebox: to automate the same sorts of tasks. It wasn't until much later that I figured out I could even auto-comment on blogs with Scrapebox.
The BEST value in Scrapebox is NOT mass blog commenting. That has its place, but the juiciest links are found by using the techniques in the video along with the various addons and functions Scrapebox has.
Make sure when you watch this you don't look at it from a “how can I do exactly what he is talking about” angle, but rather from “how can I leverage this concept on all sorts of platforms, in all sorts of scenarios, to gain the best links possible”.
Then use the stealing-your-competitors'-backlinks process that's also on my YouTube channel; I recorded it just a couple of weeks ago. After you find your competitors' links, PR check them, Alexa check them, OBL check them, etc., just like I show in the video; apply those techniques and find your competitors' best links. Then go eyeball them for a bit and reverse engineer the concept using the techniques in the video. You will find that with a little effort you can pull in the links most SEOs only dream about, or pay TOP dollar for. I know; I've sold these sorts of links, and lists of them, in the past. Nowadays I only use them for myself; I don't sell them. They are too valuable to sell. Happy linking!
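To make the filtering step concrete, here's a rough Python sketch of the same idea. The metric field names and thresholds are made up for illustration; in practice these numbers come from Scrapebox's PR, Alexa, and OBL checkers:

```python
# Hypothetical per-URL metrics, the kind Scrapebox's checkers export.
candidates = [
    {"url": "http://example.com/resources", "pr": 5, "obl": 12,  "dofollow": True},
    {"url": "http://example.org/links",     "pr": 2, "obl": 140, "dofollow": True},
    {"url": "http://example.net/blogroll",  "pr": 6, "obl": 30,  "dofollow": False},
]

def best_links(pages, min_pr=3, max_obl=50):
    """Keep only dofollow pages with decent PageRank and a low
    outbound-link count, mirroring the filtering done in Scrapebox."""
    return [
        p for p in pages
        if p["dofollow"] and p["pr"] >= min_pr and p["obl"] <= max_obl
    ]
```

Run against the sample list above, only the first page survives: it's dofollow, PR 5, and carries just 12 outbound links.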
This video covers how to safely scrape Google using private proxies.
Private proxies can be very fast, and they are reliable because you are paying for them. In fact, just a couple of connections on private proxies can be as fast as 10-20 times that many connections on public proxies.
Another advantage is that you don't have to constantly obtain fresh lists; you simply keep using your private proxies, because they don't get blocked. This method generally works with shared proxies as well.
The custom harvester has been available in the main Scrapebox GUI for some time now. If you haven't used it, it's a great tool that lets you scrape 20+ search engines and add your own.
Now you can use the custom harvester in the Automator, so you can set up all the engines you want to scrape and then chain on everything else you want to do as well.
Also, in the next update you will have the option in Scrapebox to adjust proxy retries for the custom harvester.