  
===== Settings =====
{{ :settings.png |}}
  
==== Internal Proxy Server ====
  
===== Provider =====
{{ :provider.png |}}
  
A website that offers free proxy servers is called a //provider// within GSA Proxy Scraper. The software comes with over 800 providers.
When this option is checked, the program parses all those links for new proxies. New proxy sources are often found that way, and you can later add them here as well. In the proxy list such proxies are shown as "//Proxy-Search Links - ...//".

Another option called "**Use Search Engines to locate proxy lists**" will actively use search engines (Google, Bing, ...) with your queries (**Other**) or the found **IP**s themselves to find new proxies.
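
To make the idea concrete, here is a minimal sketch of the kind of parsing such a proxy source implies: fetch a page and pull out everything that looks like an //IP:port// pair. This is not the program's own parser; the URL and the pattern are placeholders for illustration only.

<code python>
import re
import urllib.request

# Proxies are assumed to appear in the page text as plain "IP:port" pairs.
PROXY_RE = re.compile(r"\b(\d{1,3}(?:\.\d{1,3}){3}:\d{2,5})\b")

def scrape_provider(url):
    """Fetch one provider page and return the unique IP:port pairs found in it."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return sorted(set(PROXY_RE.findall(html)))

# Hypothetical usage -- the URL is a placeholder, not a real provider:
# print(scrape_provider("http://example.com/free-proxy-list"))
</code>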
  
  
  
===== Automatic Export =====
{{ :export.png |}}
  
There are many options to export proxies automatically. You can define an interval and different ways to export.
Each export offers many different filters. If you plan to create an export for //GScraper// (a well-known tool for search engine parsing), make sure that only proxies with an IP are exported (filter option **Exclude proxies with a domain**), as //GScraper// will not import anything if the list contains a domain.
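
As a rough illustration of what **Exclude proxies with a domain** means, the sketch below keeps only entries whose host part is a plain IP address before writing them to a file for //GScraper//. The sample list and the file name are made up; the real export is done by the program itself.

<code python>
import ipaddress

def only_ip_proxies(proxies):
    """Keep proxies whose host part is an IP address, dropping domain-based ones."""
    kept = []
    for proxy in proxies:
        host = proxy.rsplit(":", 1)[0]
        try:
            ipaddress.ip_address(host)   # raises ValueError for domain names
        except ValueError:
            continue                     # GScraper would reject this entry
        kept.append(proxy)
    return kept

# Hypothetical sample data and output file name:
proxies = ["1.2.3.4:8080", "proxy.example.com:3128", "5.6.7.8:1080"]
with open("gscraper_proxies.txt", "w") as f:
    f.write("\n".join(only_ip_proxies(proxies)))
</code>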
===== Automatic Search =====
{{ :auto_search.png |}}
  
Having this checked means the program will search all providers for new proxies. You can define the **Interval** here, as well as the conditions for when this should happen or stop.
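
The interval/stop-condition idea can be pictured roughly as follows; the interval, the stop threshold and the ''find_new_proxies()'' placeholder are assumptions for illustration only, not the program's actual behaviour.

<code python>
import time

INTERVAL_MINUTES = 30   # assumed interval between searches
STOP_AT = 500           # assumed stop condition: enough proxies collected
MAX_ROUNDS = 48         # safety cap so the sketch always terminates

def find_new_proxies():
    """Placeholder for a real provider search (see the Provider sketch above)."""
    return set()

collected = set()
for _ in range(MAX_ROUNDS):
    collected |= find_new_proxies()
    if len(collected) >= STOP_AT:
        break
    time.sleep(INTERVAL_MINUTES * 60)
print(f"collected {len(collected)} proxies")
</code>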
  
===== Filter =====
{{ :filter.png |}}
  
I don't recommend using any of the filter options, as you can always define filters for the automatic exports. However, for those who want this, the following options are available: