Web Succ Quickstart
Install Python locally or use a cloud IDE, then copy and paste the Web Succ code
Create a Google Sheet and get API credentials for Twitter and WebHose
Choose which keywords you want to monitor on Twitter and across the web
Features Coming Soon
Enhanced Filtering Options
Real Time Alerts
Get updated on new features
Subscribe on YouTube
Keyword Monitoring – Track your selected keywords in real time on Twitter. Filter based on account size to identify influencers
PR – Put out fires early by identifying people talking about your business online
Lead Generation – Find people who are interested in the niche you are monitoring
Data Collection – Create unique datasets for content marketing purposes or to analyze trending topics
The first thing you need is an environment capable of running a Python script. The easiest way to accomplish this is to create an account on coder.com, which gives you a live VS Code IDE running in the cloud where you can run your code without having to install anything locally on your computer.
CODER IS CURRENTLY NOT TAKING NEW SIGNUPS, IF YOU WANT TO USE WEB SUCC YOU WILL NEED TO INSTALL PYTHON LOCALLY
You can follow my tutorial for setting up Python and VS Code editor on your computer here
Twitter API credentials
To pull data from Twitter you’ll need a Twitter API key. To get one, apply for a Twitter developer account here; it’s an easy process and you’ll generally get an automated reply within 10-15 minutes confirming your developer account. Once your account has been confirmed, go to your app and get your API keys and tokens by clicking on the “Keys and Tokens” tab:
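Once you have the four values from the “Keys and Tokens” tab, the script needs access to them. Here is a minimal sketch of how they might be stored; the variable and environment-variable names are illustrative, not Web Succ’s actual configuration:

```python
import os

# Placeholder names -- Web Succ's config may label these differently.
# The four values come from the "Keys and Tokens" tab of your Twitter app.
twitter_credentials = {
    "api_key": os.environ.get("TWITTER_API_KEY", "YOUR_API_KEY"),
    "api_secret": os.environ.get("TWITTER_API_SECRET", "YOUR_API_SECRET"),
    "access_token": os.environ.get("TWITTER_ACCESS_TOKEN", "YOUR_ACCESS_TOKEN"),
    "access_token_secret": os.environ.get(
        "TWITTER_ACCESS_TOKEN_SECRET", "YOUR_ACCESS_TOKEN_SECRET"
    ),
}
```

Reading the keys from environment variables (with placeholder fallbacks) keeps them out of your source code.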
Create Google Sheet and enable Google APIs
Next you’ll need to create the Google Sheet that will receive your data. Once you’ve got that sheet made, go to console.developers.google.com and create a project. Then click the “Enable APIs and Services” button at the top of your screen.
Once you’re in the API Library, enable the Google Sheets and Google Drive APIs. The last thing we need to do is create a Service Account that allows us to use these newly enabled APIs. Go back to the “APIs and Services” dashboard, click the “Credentials” tab, and create a “Service Account Key” from the drop-down options.
Name the service account whatever you want, but make sure you set the role to “Project Owner”, then download the JSON file containing your credentials.
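With the JSON credentials downloaded, a script can open your sheet. The sketch below assumes the gspread library (pip install gspread); Web Succ may use a different client, and the file and sheet names are placeholders:

```python
def open_sheet(credentials_path, sheet_name):
    """Open the first worksheet of a Google Sheet using a service-account
    JSON file. Requires the Sheets and Drive APIs to be enabled."""
    import gspread  # imported lazily; pip install gspread

    client = gspread.service_account(filename=credentials_path)
    return client.open(sheet_name).sheet1

# Usage (placeholder names):
# worksheet = open_sheet("credentials.json", "My Monitoring Sheet")
# worksheet.append_row(["keyword", "url", "date"])
```

Note that the sheet must be shared with the service account’s client_email (found in the JSON file) for the open call to succeed.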
Web Succ Tutorial Video
If you have any trouble during the installation process, you can ask questions in the YouTube video’s comment section
The Webhose API allows you to pull data from all across the web and filter it based on a variety of conditions. The first thing you will need to do is get an API key which will allow you to make 1000 requests a month for free.
Go to webhose.io and create your account. Once you’ve done that, copy and paste your API key from the dashboard.
Working with WebHose
To get started we will go over some common use cases, though the options are really only limited by your creativity. I recommend using the API playground available in your account dashboard to practice before using the Web Succ integration so you can get a visual idea of how things work. The below screenshot shows how the UI allows you to add filters just by clicking.
You can then either make the API call and see the results displayed, or copy and paste the query provided into Web Succ if you want to have the data sent to your Google Sheet. Make sure you have the “Python” tab selected if you want to get your query from the UI.
Query Formatting Fundamentals
keywords – The key thing to keep in mind when choosing your keywords is general vs exact match. Exact-match keywords are specified by wrapping the phrase in quotation marks. Without the quotation marks, WebHose will pull results with those words in any order, meaning you will get a far larger set of results that probably isn’t what you are expecting. An example of this would be python programming vs “python programming”. The first, without the quotes, would include articles about actual living pythons, while the second would only select articles where the phrase “python programming” appears intact.
AND/OR – AND/OR allows you to get more specific with your keywords. Continuing the example from above, the problem with using only “python programming” as an exact match is that many articles about Python programming don’t include that specific phrase, so we could filter out a lot of the results we are looking for. We can solve this issue by using the AND/OR keywords. To optimize our filter we would write:
(“python” AND “programming”)
What this means is that WebHose will now return results for any article that contains both the words python and programming, rather than just articles where “python programming” is used in that specific order.
The OR keyword can be used to find articles matching any of several words or phrases that fall under a similar topic. Let’s say we want to send data related to marketing to our spreadsheet; our query could look like this:
(“content marketing” OR “SEO” OR “facebook ads”)
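Since these OR groups all follow the same shape, they are easy to build programmatically. A small illustrative helper (not part of Web Succ itself):

```python
def or_query(phrases):
    """Wrap each phrase in quotes (exact match) and join them with OR."""
    return "(" + " OR ".join('"%s"' % p for p in phrases) + ")"

print(or_query(["content marketing", "SEO", "facebook ads"]))
# → ("content marketing" OR "SEO" OR "facebook ads")
```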
Filters – WebHose provides numerous filters in addition to keywords. The general format is filtername:value, with no space around the colon.
So if we wanted to return only articles in English it would look like this: language:english
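Filters are simply appended to the query string alongside your keywords. A sketch of how the final query might be assembled before it is sent to the Webhose API (the helper name is illustrative, not Web Succ’s actual function):

```python
def build_query(keywords, filters=()):
    """Combine a keyword clause with filter clauses like language:english."""
    return " ".join([keywords] + list(filters))

q = build_query('("python" AND "programming")', ["language:english"])
print(q)  # ("python" AND "programming") language:english
```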
Sorting options – The default sorting option used by WebHose is the date the article was crawled. Generally this is the recommended method for sorting your data, but there are cases where you may want to use the other available options. An example would be sorting articles about a certain topic by Facebook shares to create an article like “Top 10 most shared articles about x in the last month”.
To do this you can use Web Succ’s built-in sorting support: pass your chosen option from the sort_options object via the “sort” parameter.
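As a sketch, the sort_options object could be a simple mapping from friendly names to the Webhose sort values; the names below are assumptions, so check Web Succ’s source and the Webhose docs for the exact spellings:

```python
# Assumed mapping -- the keys are illustrative; the values are the sort
# names the Webhose API documents for crawl date, shares, and rank.
sort_options = {
    "crawl_date": "crawled",  # default
    "social_shares": "social.facebook.shares",
    "domain_rank": "domain_rank",
}

def with_sort(params, option="crawl_date"):
    """Return a copy of the request params with the chosen sort applied."""
    out = dict(params)
    out["sort"] = sort_options[option]
    return out
```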
Currently Web Succ defaults to sorting by crawl date but also supports social shares and domain rank. Other sorting options will be added over time, but these cover the most common use cases.
Monitor Reddit posts for specific keyword – This example makes use of the site filter and is_first filter to monitor only Reddit posts themselves for certain keywords while ignoring comments:
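A query for that use case might look like the following; “python programming” is a placeholder keyword, while site: and is_first: are Webhose’s own filter names:

```python
# Only original Reddit posts (is_first:true skips comments/replies)
# that mention the exact phrase; swap in your own keyword.
reddit_query = 'site:reddit.com is_first:true "python programming"'
```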
Find backlinks by looking for plain-text mentions of your site URL – It’s not uncommon for references to your site to appear as plain text rather than as a backlink, which means you don’t get any SEO benefit. This query can be used to find URLs with these plain-text references to your site so you can try to contact the website owner and convert the mention into a backlink. Common keywords to use would be your business name and variations of your website URL like “website.com” and “http://website.com”. We will also sort by domain rank so that the highest-value sites are at the top of our list:
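A sketch of what that request might look like; “website.com” is a placeholder for your own domain, and domain_rank is the Webhose sort value for domain authority:

```python
backlink_params = {
    # Plain-text variations of your URL; add your business name too.
    "q": '("website.com" OR "http://website.com")',
    # Highest-authority domains first.
    "sort": "domain_rank",
}
```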
Shutting off Web Succ – Use CTRL + C in the terminal to stop the program