From: sln on
On Sun, 11 Jul 2010 14:22:13 -0700 (PDT), Peng Yu <pengyu.ut(a)gmail.com> wrote:

>On Jul 10, 1:15 pm, "Peter J. Holzer" <hjp-usen...(a)hjp.at> wrote:
>> On 2010-07-10 17:29, Peng Yu <pengyu...(a)gmail.com> wrote:
>>
>> > Suppose I see a webpage on a website that has the following link
>> > encoded by javascript. I want to use Perl to parse such a webpage and
>> > convert it to the actual URL. Could you please let me know what
>> > package I should use? I thought that an HTML package may not handle this,
>> > but please let me know if I'm wrong.
>>
>> ><a href="javascript:some_command_return_a_url();">Link</a>
>>
>> I would start at http://search.cpan.org/search?query=javascript&mode=all
>
>There are many results returned. Which one is the best one to help me
>solve my particular problem?

I'm a novice on the matter, but it would seem to me that
javascript:some_command_return_a_url(); is embedded and that when it
comes time to render the document, some_command_return_a_url() is
executed by a JS interpreter engine and the result assigned to that
element's attribute (href). But I'm not sure that's the case, or whether
"javascript:" is even valid in this place (I'm too lazy to look it up at
w3schools).
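
For the first half of the problem -- just finding those javascript: hrefs
in the markup -- a plain HTML parser is enough. Here's a rough sketch with
LWP::UserAgent and HTML::TreeBuilder (the URL is made up, and this only
pulls out the JS expression, it doesn't run it):

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::TreeBuilder;

    my $ua   = LWP::UserAgent->new;
    my $resp = $ua->get('http://example.com/page.html');   # hypothetical URL
    die $resp->status_line unless $resp->is_success;

    my $tree = HTML::TreeBuilder->new_from_content($resp->decoded_content);
    for my $a ($tree->look_down(_tag => 'a')) {
        my $href = $a->attr('href') or next;
        if ($href =~ /^javascript:(.*)/si) {
            # $1 is the JS expression; running it still needs a JS engine
            print "javascript: link found, expression: $1\n";
        }
    }
    $tree->delete;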

It would seem to me that such a package would have to be as powerful as
a web browser, with the ability to host a JS engine.
Because I don't know about these engines, I don't know whether the engine
can actually call the underlying OS or whether the host does so on its
behalf (sandbox and all that stuff).
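
That said, if the expression in the href is self-contained (no window or
document object), you may not need a whole browser: there is a pure-Perl
ECMAScript engine on CPAN, JE. A rough sketch, assuming you've already
scraped the relevant function definition out of the page's <script> block
(both strings below are made up for illustration):

    use strict;
    use warnings;
    use JE;    # pure-Perl ECMAScript engine from CPAN

    my $je = JE->new;

    # Hypothetical: the function source scraped from the page's <script>
    # block, plus the expression found in the javascript: href.
    my $script_src = 'function some_command_return_a_url() { return "http://example.com/real"; }';
    my $href_expr  = 'some_command_return_a_url();';

    $je->eval($script_src);                 # load the page's function(s)
    my $url = $je->eval($href_expr);        # evaluate the href expression
    defined $url or die "JS evaluation failed: $@";
    print "Resolved URL: $url\n";

Anything that touches the browser environment won't work this way, though;
for that you're back to driving a real browser (something along the lines
of WWW::Mechanize::Firefox) rather than interpreting the page yourself.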

It would be kind of cool to have your own back-door browser that
rummages around web sites in real time, displaying a developer screen
that features a full-featured debugger (like m$ VS2008) for those
engines and the HTML. Imagine: super automation, validation,
auto-created test cases, server hacking, scraping ..
The list is endless.

-sln