- Kali Linux Web Penetration Testing Cookbook
- Gilberto Nájera Gutiérrez
Downloading a page for offline analysis with Wget
Wget is a part of the GNU project and is included in most of the major Linux distributions, including Kali Linux. It has the ability to recursively download a web page for offline browsing, including conversion of links and downloading of non-HTML files.
In this recipe, we will use Wget to download pages that are associated with an application in our vulnerable_vm.
Getting ready
All recipes in this chapter require the vulnerable_vm to be running. In the particular scenario of this book, it has the IP address 192.168.56.102.
How to do it...
- Let's make the first attempt to download the page by calling Wget with a URL as the only parameter:
wget http://192.168.56.102/bodgeit/
As we can see, it only downloaded the index.html file, the start page of the application, to the current directory.
- We will have to use some options to tell Wget to save all the downloaded files to a specific directory and to copy all the files contained in the URL that we set as the parameter. Let's first create a directory to save the files:
mkdir bodgeit_offline
- Now, we will recursively download all files in the application and save them in the corresponding directory:
wget -r -P bodgeit_offline/ http://192.168.56.102/bodgeit/
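If we want to verify what was saved, we can list the downloaded tree. Note that, by default, Wget creates a subdirectory named after the host under the prefix we specified:
ls -R bodgeit_offline/
With our setup, the downloaded files should appear under bodgeit_offline/192.168.56.102/bodgeit/.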
How it works...
As mentioned earlier, Wget is a tool created to download HTTP content. With the -r parameter, we made it act recursively; that is, it follows all the links in every page it downloads and downloads them too. The -P option allows us to set the directory prefix, which is the directory where Wget will start saving the downloaded content; by default, it is set to the current path.
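As a quick illustration of that default, running the same recursive download without -P would create the host directory directly in the current path:
wget -r http://192.168.56.102/bodgeit/
This would leave the files under ./192.168.56.102/bodgeit/ rather than in our dedicated directory.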
There's more...
There are some other useful options to consider when using Wget; a combined example follows the list:
- -l: When downloading recursively, it might be necessary to limit the depth Wget goes to when following links. This option, followed by the number of levels of depth we want to go to, lets us establish such a limit.
- -k: After files are downloaded, Wget modifies all the links to make them point to the corresponding local files, thus making it possible to browse the site locally.
- -p: This option lets Wget download all the images needed by the page, even if they are on other sites.
- -w: This option makes Wget wait the specified number of seconds between one download and the next. It's useful when there is a mechanism to prevent automatic browsing in the server.
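Putting these options together, a command such as the following (the depth and wait values here are illustrative choices, not taken from the recipe) would mirror the application up to three levels deep, convert links for local browsing, fetch page requisites, and pause one second between requests:
wget -r -l 3 -k -p -w 1 -P bodgeit_offline/ http://192.168.56.102/bodgeit/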