From: Alexey Smirnov on
On Mar 16, 6:11 am, "Mark B" <none...(a)none.com> wrote:
> Is that URL redirection via webconfig SEO friendly?
>
> Basically our concept is similar to a dictionary.com one where they have:
>
> dictionary.com/truck
> dictionary.com/trunk
> dictionary.com/try
>
> etc
>
> and each page is laden with the particular keyword. I thought the only way
> they did this was by creating separate pages for each.
>
> "Mike Lovell" <dont.re...(a)gotinker.com> wrote in message
>
> news:OG9Sr8LxKHA.4552(a)TK2MSFTNGP04.phx.gbl...
>
>
>
> >> Yeah but I need to save the aspx file onto the disk on the server so
> >> later someone else can go to www.domain.com/mypage102.aspx
>
> > Yes, you could just save that information using:
>
> > File.WriteAllText(filename, data);
>
> > In the System.IO Namespace.
>
> > It does depend on what you're trying to do though, you can carry out URL
> > redirection and things like this in your 'web.config' - Where you can have
> > different URL's correspond to a single page, which you can alter based on
> > which URL the browser called.
>
> > --
> > Mike
> > GoTinker, C# Blog
> >http://www.gotinker.com

Mark,

Don't obsess over SEO-friendly URLs. A link like /page.aspx?id=truck
carries the same meaning as a link like /truck.

If you definitely want "short" URLs, then you can use either an
HttpModule (google for "URL Rewriting") or ASP.NET MVC. In both cases
the idea is not to create a new .aspx file on disk, but to return the
output as if it were a separate page.

Let me know if you have further questions regarding this.

Hope this helps.
From: Mark B on
So if in the robots.txt I had:

www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try

they'd all be stored separately in Google? It would be nice if they did --
save us a lot of work and disk space.

So I would need to programmatically re-write the robots.txt whenever another
word was added to the database? Or would it suffice if my homepage had all
these links on (created programmatically)?

From: Patrice on
Hello,

Rather than throwing ideas at us to achieve some unknown goal, could
you start by explaining what you are trying to do?

For now, my understanding is that you would like to have "friendly" URLs,
which is done by using what is called "URL rewriting". See for example:
http://msdn.microsoft.com/en-us/library/ms972974.aspx

The idea is that the request to a friendly URL is intercepted, and your
URL rewriting module then transparently directs the request to an actual
page, possibly with some URL parts turned into query string parameters...
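For instance, with the IIS URL Rewrite module installed, a single rule
along these lines (rule name and paths are illustrative only) would map
/definitions/truck to the actual page:

```xml
<!-- web.config fragment; assumes the IIS URL Rewrite module is
     installed. Rule name and paths are illustrative only. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="FriendlyDefinitions">
        <!-- Match /definitions/truck, /definitions/try, ... -->
        <match url="^definitions/([a-z]+)$" />
        <!-- Serve the real page, passing the word as a query string. -->
        <action type="Rewrite" url="definitions/default.aspx?id={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

One page plus one rule then covers every word in the database, with no
per-word .aspx files on disk.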

Also, having the big picture could help us make better suggestions. Do
you want to do this only for search engines, or do you also want to
actually use those friendly URLs on your site? What benefit are you
looking for?


--
Patrice

"Mark B" <none123(a)none.com> wrote in the newsgroup message
: #D4Sa6PxKHA.948(a)TK2MSFTNGP05.phx.gbl...
> So if in the robots.txt I had:
>
> www.mysite.com/definitions/default.aspx?id=truck
> www.mysite.com/definitions/default.aspx?id=trunk
> www.mysite.com/definitions/default.aspx?id=try
>
> they'd all be stored separately in Google? It would be nice if they did --
> save us a lot of work and disk space.
>
> So I would need to programmatically re-write the robots.txt whenever
> another word was added to the database? Or would it suffice if my homepage
> had all these links on (created programmatically)?

From: Patrice on
Ok, have you checked Google Webmaster Tools? AFAIK they provide quite a
bunch of tools, including the ability to see how your site is seen by
the Google indexer, and guides about best practices...

The key point here is to understand and measure how each change impacts
your site, rather than making random changes with no way to find out
whether they improved (or possibly damaged) your ranking...

--
Patrice

"Mark B" <none123(a)none.com> wrote in the newsgroup message
: e9CL6vnxKHA.5036(a)TK2MSFTNGP02.phx.gbl...
> So I now have:
>
> http://www.factorwords.com/definition/default.aspx?word=Blooter instead of
> http://www.factorwords.com/definition/blooter.aspx referenced at
> http://www.factorwords.com/ along the left.
>
> Hopefully Google will rank those OK.



From: Alexey Smirnov on
On Mar 16, 12:48 pm, "Mark B" <none...(a)none.com> wrote:
> So if in the robots.txt I had:
>
> www.mysite.com/definitions/default.aspx?id=truck
> www.mysite.com/definitions/default.aspx?id=trunk
> www.mysite.com/definitions/default.aspx?id=try
>
> they'd all be stored separately in Google? It would be nice if they did --  
> save us a lot of work and disk space.
>
> So I would need to programmatically re-write the robots.txt whenever another
> word was added to the database? Or would it suffice if my homepage had all
> these links on (created programmatically)?

The robots.txt file is used to tell search engine spiders which content
to exclude from crawling. You don't need to list every single URL there.
To let all pages be indexed, you can either delete robots.txt or put
just the following two lines in it:

User-agent: *
Disallow:

I don't think it would be a problem if you enumerated all the links in
that file, but I'm pretty sure it will not help increase your ranking.