From: paris2venice on
Can one list a remote web directory with curl? I did this (as per the
man page)-
curl -l http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US
but got a 301 error. ftp allows me to list it though.
From: Stephane CHAZELAS on
2009-12-21, 02:13(-08), paris2venice:
> Can one list a remote web directory with curl? I did this (as per the
> man page)-
> curl -l http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US
> but got a 301 error. ftp allows me to list it though.

-l is for FTP listings; it has no effect on HTTP. There's no
such thing as a directory listing command in HTTP; all you get
is a server-generated HTML index page when accessing a directory
that doesn't have an index page of its own.

Use -L to follow the redirect:
curl -L http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US

Or avoid the redirection:
curl http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US/
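
You can check what's actually happening by asking for the headers only; the 301 is just the server redirecting to the same path with a trailing slash added (a quick sanity check, the exact output depends on the server):

curl -sI "http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US" | grep -iE '^(HTTP|Location)'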

--
Stéphane
From: paris2venice on
On Dec 21, 5:11 am, Stephane CHAZELAS <stephane_chaze...(a)yahoo.fr>
wrote:
> 2009-12-21, 02:13(-08), paris2venice:
>
> > Can one list a remote web directory with curl?  I did this (as per the
> > man page)-
> > curl -l http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US
> > but got a 301 error.  ftp allows me to list it though.
>
> -l is for FTP listings; it has no effect on HTTP. There's no
> such thing as a directory listing command in HTTP; all you get
> is a server-generated HTML index page when accessing a directory
> that doesn't have an index page of its own.
>
> Use -L to follow the redirect:
> curl -L http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US
>
> Or avoid the redirection:
> curl http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US/
>
> --
> Stéphane

Thanks, but I'm still having difficulty. I strip all the HTML; that
works. I extract the filename; that works. But when I put it all
together, it doesn't download and I get a "file not found" error.
The filename contains a space (Firefox 3.5.6.dmg), but it would be nice
not to have to know that, since the whole point is to download the
latest Firefox with bash without knowing the current version number.

#!/bin/bash
url="http://releases.mozilla.org//pub/mozilla.org/firefox/releases/
latest/mac/en-US"
line=$( curl -sL "$url" | sed -e 's/<[^>]*>/ /g' | grep "Firefox.*dmg
" )
file=$( echo "$line" | awk '{printf "%s %s\n",$1,$2}' )
curl -# -C - -L -o Firefox.dmg "$url/$file"

From: Stephane CHAZELAS on
2009-12-22, 04:34(-08), paris2venice:
[...]
> Thanks, but I'm still having difficulty. I strip all the HTML; that
> works. I extract the filename; that works. But when I put it all
> together, it doesn't download and I get a "file not found" error.
> The filename contains a space (Firefox 3.5.6.dmg), but it would be nice
> not to have to know that, since the whole point is to download the
> latest Firefox with bash without knowing the current version number.
>
> #!/bin/bash
> url="http://releases.mozilla.org//pub/mozilla.org/firefox/releases/
> latest/mac/en-US"
> line=$( curl -sL "$url" | sed -e 's/<[^>]*>/ /g' | grep "Firefox.*dmg
> " )
> file=$( echo "$line" | awk '{printf "%s %s\n",$1,$2}' )
> curl -# -C - -L -o Firefox.dmg "$url/$file"

Get the file name from the href target instead of the display
string; that saves you from doing the escaping yourself:

#! /bin/sh -
url="http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US"
file=$(curl -sL "$url" | sed '/.*href="\(Firefox[^"]*dmg\)".*/!d;s//\1/;q')
[ -n "$file" ] && curl -# -L -o Firefox.dmg "$url/$file"


--
Stéphane
From: mik3 on
On Dec 21, 6:13 pm, paris2venice <paris2ven...(a)gmail.com> wrote:
> Can one list a remote web directory with curl?  I did this (as per the
> man page)-
> curl -l http://releases.mozilla.org//pub/mozilla.org/firefox/releases/latest/mac/en-US
> but got a 301 error.  ftp allows me to list it though.


url="http://releases.mozilla.org//pub/mozilla.org/firefox/releases/
latest/mac/en-US/"
download=$(curl -sL "$url" | awk -v url="$url" '
BEGIN{RS="</a>"}                     # one record per <a>...</a> anchor
/Firefox.*dmg\042/{                  # the record with the Firefox .dmg href (\042 is a double quote)
gsub(/.*a href=\042|\042>.*/,"")     # strip everything except the href value
print url $0                         # prepend the directory URL
}')
curl <options> "$download"
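
Replace <options> with whatever curl flags you want; reusing the ones from your own script, something like (just a sketch):

curl -# -C - -L -o Firefox.dmg "$download"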