From: srikanth on
On Jun 8, 8:29 pm, Bit Twister <BitTwis...(a)mouse-potato.com> wrote:
> On Tue, 8 Jun 2010 07:39:30 -0700 (PDT), srikanth wrote:
> > Twister,
> > Can I do one thing here? I will start the function in the background,
> > and once the first URL is processed it will kill the browser; then the
> > loop will open the browser again with the next set of URLs. Is there any
> > way to open them in the same window instead of killing and reopening it?
>
> The only kludge I can think of is to find some application used to
> regression-test GUI applications. It would have to feed input
> into the GUI application to test its functionality. No idea whether
> "expect" could be used to feed URLs to a browser.
>
> As far as I am concerned, you are using the wrong tool for what you are
> trying to do. If it were me, I would use wget or html2text, depending on
> the site. I use those to fetch web pages and scan them for updates to
> applications I have downloaded from that site.
>
> Snippet examples follow:
>
> _exe=$(basename $0)
> _ref_dir=/local/data/$_exe
> pick_msg=(" ")
>
> function compare_lst_ref
> {
>     cmp -s $_lst_fn $_ref_fn
>     if [ $? -ne 0 ] ; then
>       echo "
>       ww ; diff -bBw $_ref_fn $_lst_fn
>
>       \cp  $_lst_fn $_ref_fn
>
>       Get new $_app at $_url
>
>       Download from $_download"  > $_tmp_fn
>
>       printf "%s\n" "${pick_msg[@]}"  >> $_tmp_fn
>
>       echo "
>       This message is from /local/bin/$_exe
>       "  >> $_tmp_fn
>       mail -s "$_app From $_exe" $USER < $_tmp_fn
>       /bin/rm $_tmp_fn
>     fi
>
> } # end compare_lst_ref
>
>         #****************************************************
>         #* check for virtual box update
>         #****************************************************
>
> _app=virtbox
> _url='http://www.virtualbox.org/wiki/Linux_Downloads'
> _download="$_url"
> _wgt_fn=$TMPDIR/Linux_Downloads
> _lst_fn=$TMPDIR/${_app}.lst
> _ref_fn=$_ref_dir/Linux_Downloads_${_app}.ref
> pick_msg=("Pick .i586.rpm")
>
> rm -f ${_wgt_fn}*
> wget $_url -o /dev/null
> grep -i Mandriva  $_wgt_fn > $_lst_fn
> grep -i 'All distributions' $_wgt_fn >> $_lst_fn
> compare_lst_ref
>
>         #****************************************************
>         #* check for java jre update
>         #****************************************************
>
> _app=jre
> _url='http://java.sun.com/javase/downloads/index.jsp'
> _download=$_url
> _lst_fn=$TMPDIR/${_app}.lst
> _ref_fn=$_ref_dir/${_app}.ref
> pick_msg=("Pick JRE and i586-rpm.bin")
>
> html2text -nobs -style pretty -width 132 $_url | grep "JRE" > $_lst_fn
> compare_lst_ref

I have used it just because that command makes it very easy to open a
URL in the user's default web browser. All that needs fixing is this:
it should either open each URL one by one in the same window, or
relaunch the browser for each URL it processes.
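
For the same-window idea, here is a minimal untested sketch. It assumes
the common desktop behavior that, once a browser instance is running, a
further xdg-open call hands the URL to that instance as a new tab in the
existing window. The function name and the OPEN_DELAY knob are made up
for illustration:

```shell
# Sketch: open each URL from a file in the already-running browser.
# Assumption: the default browser reuses its existing window/tab set
# when xdg-open is called again. OPEN_DELAY is a hypothetical pause
# between URLs to give each page time to load.

open_urls() {
    while IFS= read -r url; do
        [ -n "$url" ] || continue      # skip blank lines
        xdg-open "$url"                # reuses the running browser window
        sleep "${OPEN_DELAY:-10}"      # hypothetical per-URL delay
    done < "$1"
}

# usage: open_urls urls.txt
```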

To relaunch the browser for each URL, my thought is to just open the
browser in the background and let it process the URL. Once it is done,
kill that process and relaunch the browser to process the next URL.
Is there a way to do it, Twister?


From: Bit Twister on
On Tue, 8 Jun 2010 10:10:48 -0700 (PDT), srikanth wrote:

> To relaunch the browser for each URL, my thought is to just open the
> browser in the background and let it process the URL. Once it is done,
> kill that process and relaunch the browser to process the next URL.
> Is there a way to do it, Twister?

man pkill would suggest something like
pkill -u $USER xdg-open
would work.
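
As a throwaway demo of what pkill does (a background sleep stands in
for the browser process here; check the flags against your own man
page, since this is a sketch rather than the thread's actual setup):

```shell
# Demo: pkill matches processes by name for a given user.
sleep 300 &                      # stand-in for the browser process
bgpid=$!

pkill -u "$(id -un)" -x sleep    # -x: match the exact process name "sleep"
wait "$bgpid" 2>/dev/null        # reap it; exit status reflects the signal

if kill -0 "$bgpid" 2>/dev/null; then
    result="still running"
else
    result="killed"
fi
echo "$result"
```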

From: srikanth on
On Jun 8, 6:34 pm, Bit Twister <BitTwis...(a)mouse-potato.com> wrote:
> On Tue, 8 Jun 2010 02:14:28 -0700 (PDT), srikanth wrote:
>
> > It is showing only
> > Browsing URLs.........\
>
> Heheh, with nothing being processed, it spun the wheel so fast
> you never noticed much.
>
> > My input text file contains 600+ URLs. Will the progress bar show
> > until the script has processed all of my URLs?
>
> Well, yes and no. You have redirected all output from your loop into
> /tmp/output.text. Even with my change, the progress indicator goes into
> /tmp/output.text and no progress is shown on screen.
> Suggestion with some readability changes:
>
>     printf "Browsing URLs."
>
>     for j in $(cat $1)
>     do
>       echo "$j - $(HEAD -d $j)" >> /tmp/output.text 2>&1
>       show_progress
>     done

The progress bar is updating very slowly. Also, when the script was
done it showed 'Browsing URL|'. Why is it printing the | symbol after
completion? I have commented out the printf "\b.|" but it is still
showing. After the script completes it should show only "Browsing URLs."
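
For what it's worth, the leftover '|' is just the final spinner
character never being erased. A minimal sketch (the function name and
the cleanup sequence are mine, not from the thread) that wipes the
spinner once the loop ends:

```shell
# Sketch of a progress indicator that cleans up after itself.
# Prints one dot per item with a trailing spinner char, then erases
# the spinner on completion so only "Browsing URLs...." remains.

browse_progress() {
    printf 'Browsing URLs|'
    for item in "$@"; do
        printf '\b.|'        # back over the spinner, add a dot, redraw it
    done
    printf '\b \b'           # erase the trailing spinner when done
    printf '\n'
}

# usage: browse_progress url1 url2 url3
```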
From: srikanth on
On Jun 8, 10:24 pm, Bit Twister <BitTwis...(a)mouse-potato.com> wrote:
> On Tue, 8 Jun 2010 10:10:48 -0700 (PDT), srikanth wrote:
> > To relaunch the browser for each URL, my thought is to just open the
> > browser in the background and let it process the URL. Once it is done,
> > kill that process and relaunch the browser to process the next URL.
> > Is there a way to do it, Twister?
>
> man pkill would suggest something like
>     pkill -u $USER xdg-open
> would work.

#!/bin/bash -x
if [ -z "$1" ]; then
    printf "Provide an input text file to process the URLs\n"
    exit 0
fi

# Kill the currently running browser session
pkill -u $USER chrome

for i in $(cat $1)
do
    echo "$(xdg-open $i)"
    pkill -u $USER chrome
done
exit 0

Here it is not killing the browser after the first URL is processed. I
need to close the browser manually for the loop to continue.
Is anything wrong in my script?
From: Bit Twister on
On Wed, 9 Jun 2010 07:34:31 -0700 (PDT), srikanth wrote:
> On Jun 8, 10:24 pm, Bit Twister <BitTwis...(a)mouse-potato.com> wrote:
>>
>>     pkill -u $USER xdg-open
>
> pkill -u $USER chrome

> Here it is not killing the browser after first url processed.

So what is different between my suggestion and your code?

> I need
> to manually close the browser to process for loop.
> any thing wrong in my script.?

Go back and read the man page for pkill.

Note syntax at top of man page.
Then scroll down and read the OPERANDS section.
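
An untested sketch of the kill-and-relaunch pattern the thread is
after. All the specifics here are assumptions: the browser's process
name ("chrome"), and the BROWSE_TIME and BROWSER_NAME knobs, which are
made up for illustration. The key point is that xdg-open is
backgrounded, because when it has to start the browser itself it can
block until the browser exits, which would keep the loop from ever
reaching pkill:

```shell
# Sketch: open each URL, wait, kill the browser, move to the next URL.

browse_and_kill() {
    while IFS= read -r url; do
        [ -n "$url" ] || continue
        xdg-open "$url" &                               # don't let it block the loop
        sleep "${BROWSE_TIME:-15}"                      # let the page render
        pkill -u "$(id -un)" "${BROWSER_NAME:-chrome}"  # kill the browser, not xdg-open
        wait                                            # reap the backgrounded xdg-open
    done < "$1"
}

# usage: browse_and_kill urls.txt
```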