Extract URL finds URL information from a variety of sources. The program is not aimed at the average user, but it can be extremely useful for website owners, network administrators, and SEOs.
However daunting the procedure may seem, the difficulty lies not in using the program itself but in interpreting the results and knowing what to do with them. An intuitive interface lets you operate the application without much complication. The program works with sessions: although there is no real wizard, entering the required data is easy if you simply follow the order of the tabs. Everything starts with specifying the source. There are three ways to search: you can extract URLs from an entire website, from a file containing URLs, or from search engine results. At this stage you can also decide whether to include external sites or apply filters, and, if necessary, provide a password for the site or configure a proxy connection. Once the extraction parameters are set, you can tell the program to start the procedure.
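Extract URL handles all of this through its graphical interface, so no scripting is required. Purely as an illustration of what a same-domain URL extraction involves, the minimal Python sketch below crawls a site starting from a hypothetical start_url and collects the links it finds; the page limit and the domain restriction are assumptions made for the example, not settings taken from the program.

```python
# Minimal sketch of extracting URLs from a website via a breadth-first crawl
# restricted to the starting domain. This only illustrates the general idea;
# Extract URL itself is a GUI tool and does not expose this API.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href values of all anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_urls(start_url, max_pages=50):
    """Return URLs reachable from start_url, staying on the same domain."""
    domain = urlparse(start_url).netloc
    seen, queue, found = {start_url}, deque([start_url]), []
    while queue and len(found) < max_pages:
        page = queue.popleft()
        try:
            html = urlopen(page, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        found.append(page)
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            url = urljoin(page, href)
            # Keep only pages on the specified domain, mirroring the default
            # behaviour described above (external sites excluded).
            if urlparse(url).netloc == domain and url not in seen:
                seen.add(url)
                queue.append(url)
    return found

if __name__ == "__main__":
    for url in extract_urls("https://example.com"):
        print(url)
```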
The results are presented on the main screen and include data such as the URL, base, domain, title, description, keywords, date modified, and page size. In addition, you can export the report in various formats.
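The review does not list the exact export formats, but CSV is a typical choice for this kind of report. As a hypothetical illustration only, the snippet below writes records with the columns mentioned above to a CSV file; the sample record is placeholder data, not output from Extract URL.

```python
# Hypothetical sketch of exporting the report fields listed above to CSV.
# The field names mirror the columns shown on the main screen; the record
# below is invented placeholder data.
import csv

FIELDS = ["url", "base", "domain", "title", "description",
          "keywords", "date_modified", "page_size"]

def export_report(records, path="report.csv"):
    """Write one row per extracted page using the columns above."""
    with open(path, "w", newline="", encoding="utf-8") as handle:
        writer = csv.DictWriter(handle, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(records)

export_report([{
    "url": "https://example.com/about",
    "base": "https://example.com/",
    "domain": "example.com",
    "title": "About",
    "description": "About page",
    "keywords": "about, company",
    "date_modified": "2024-01-01",
    "page_size": 2048,
}])
```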
However, the program also has some limitations. It is unable to follow redirects, and it cannot extract URLs produced by unconventional navigation, such as Java applets or complex scripts. It will also fail to follow frameset pages pulled from another domain, since it does not process pages from domains other than the ones specified.
- It is easy to use.
- It extracts URLs in three different ways.
- It allows exporting the results easily.
- It cannot handle redirects, URLs generated by complex scripts, or external URLs in a frameset.