Command option sample: wget --no-directories

The following sample commands were found by searching the web; each entry links to its source.

wget --no-directories --content-disposition --restrict-file-names=nocontrol -e robots=off -A.pdf -r url    How to download all files linked on a website using wget - Unix & Linux Stack Exchange : http://unix.stackexchange.com/questions/246809/how-to-download-all-files-linked-on-a-website-using-wget
wget --no-directories --level 1 --recursive http://ftp.gnu.org/gnu/bash/bash-4.3-patches/    rtt - IT Technology Resource - Dedicated Root VPS Virtual Colocated Web Server Hosting Talk, SEO, Computer Technology : http://realtechtalk.com/articles
wget --no-directories --no-host-directories --recursive --no-parent --accept txt URL    Linux tricks : http://issarice.com/linux-tricks
wget --no-directories --output-file=wget.log --verbose --tries=5 --input-file=download.txt     wget & --exclude-directories : http://www.linuxquestions.org/questions/slackware-14/wget-and-exclude-directories-71713/
wget --no-directories --recursive --user= --password= --no-parent    Best practice for applying Program Temporary Fixes (PTFs) | Support | SUSE : http://www.suse.com/support/kb/doc/@id=7016640
wget --no-directories --retr-symlinks    Get and build the development tools : http://epics.anl.gov/base/RTEMS/tutorial/node20.html
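The samples above share a common pattern: recurse over a page but flatten everything into the current directory with --no-directories, usually combined with --no-parent and an --accept filter. A minimal sketch of that pattern follows; the URL is a hypothetical placeholder, and the command is assembled into a string and echoed rather than run, so you can inspect it before fetching anything.

```shell
#!/bin/sh
# Hypothetical start URL -- substitute the page you actually want to crawl.
url="http://example.com/docs/"

# Common flag combination seen in the samples above:
#   --no-directories   flatten: save every file into the current directory
#   --recursive        follow links from the start page
#   --no-parent        never ascend above the start URL
#   --accept .pdf      keep only files matching this suffix
cmd="wget --no-directories --recursive --no-parent --accept .pdf $url"

# Print the command instead of executing it; run it manually once verified.
echo "$cmd"
```

To actually download, replace the final `echo "$cmd"` with `$cmd` (or paste the printed line into a shell); --accept can take a comma-separated list such as `--accept pdf,txt`.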