I have a strange requirement!
I want to create a website that is NOT found by the search engines!
Basically, I have a large number of photographs on an unusual subject that I want to submit to a number of publishers in the hope of getting someone to publish them as a book. Rather than print them all individually and send them all off, I thought it would be a good idea to create a website with all the photos and text that I have written, and to just give the editors the website address so that they can view them online. (For example, I have found out that Thames and Hudson don't want hardcopies, they want book proposals to be sent by email with no attachments, and they will then contact you if they are interested in your idea).
I realise that nothing on the internet is 100% secure, but I would rather my webpage wasn't found by accident by someone via a search engine.
Does anyone have any suggestions? Also, does anyone have any tips or experience of how to get a book of photographs published? (I realise that this is a bit off subject on this particular forum - but hey, someone may know!)
Search Engine "Minimisation"!!!
- Normandy Cow
- Posts: 2687
- Joined: Sun Nov 28, 2004 7:14 am
- Location: Normandy
- Contact:
Catherine, when I've wanted to do that I have added the pages to my website but not made them navigable from the main site. I did this just for fun during our recent renovations, to let friends see the progress (or lack of it at times...): http://www.lilycottagebelford.co.uk/diy.htm I just posted that link. I also used pages like that to test stuff out, like the mapquest thing at the bottom. I'm not sure whether they will be picked up by search engines eventually, though? I wonder if you just do the opposite of the things you'd do if you wanted to be found?
Another thing is to get an account with photobox.co.uk or another online photo storage site. You can choose to make the account public, private or shared, and "invite" people to the page.
Good luck!
soodyer wrote: Another thing is get an account with photobox.co.uk or other online photo storage. You can choose to make the account public or private or shared and "invite" people to the page.
Thanks Sue. I did start to do this on flickr.com, but I think that when you "invite" people to the page they still have to fill in a registration form before they can view your stuff, and I wanted to put the minimum number of barriers in their way. And as it is no problem for me to knock up a simple website, I thought it might be better to do it that way...
You can do this, as Sue says, by having a page not linked to from anywhere else. And to make sure search engines don't index it you can put a line in the html telling robots to stay away. I can't remember exactly how, but I'm sure our technical friends will have the details on this.
Paolo
Lay My Hat
- thisfrenchlife
- Posts: 106
- Joined: Sat Jan 08, 2005 3:28 pm
- Contact:
This will work if placed within the header of your site:
<meta name="robots" content="noindex,nofollow">
It means search engines won't record the site, or follow any links off it to other sites.
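To show where the tag goes, here is a minimal sketch of a page head (the title and filenames are just placeholders):

```html
<!-- The robots meta tag belongs inside the <head> section of each page -->
<html>
<head>
  <title>Photo portfolio</title>
  <!-- Tells well-behaved crawlers: do not index this page, do not follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
<body>
  <!-- page content here -->
</body>
</html>
```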
All the best
Craig
This French Life
http://www.thisfrenchlife.com/
Craig, could I impose upon you to help with this, too, please?
I have a number of pages that I don't want crawled (payment links, test pages, etc) and I've put them all in one directory called "secure".
Rather than inserting a meta tag on each page (which I would need to remove when I move files to a public directory), can I protect the entire "secure" directory from crawling with a text file in my root directory like this?
filename: robots.txt
User-agent: *
Disallow: /secure/
Does this do the same thing as the meta one-liner in each page's header?
Thanks for any/all advice!
debk
- dmjarvis
- Posts: 49
- Joined: Wed Sep 28, 2005 9:14 am
- Location: Missillac, Loire-Atlantique (44), France
- Contact:
Using Meta and Robots to stop Search Engine Robots
I think you will find that meta rules and robots files do not ban search engines from spidering your page(s); they merely advise them that you don't want the pages to be indexed. This is subtly different, as it will not stop the less honest spiders. In fact, if you were writing a spider program to go looking for juicy information on the web, wouldn't you be attracted to just the sort of page that owners wanted you to avoid?
I agree that this will stop it appearing in mainstream search engines though, and should be used at least as a partial solution. I might be inclined to password protect the folder or page and then supply the password to the intended recipients.
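If the site happens to be hosted on an Apache server with .htaccess files enabled (an assumption — other servers do this differently), a folder can be password protected with something like this; the paths, realm name and username below are only placeholders:

```apacheconf
# .htaccess file placed inside the folder to protect, e.g. /secure/
AuthType Basic
AuthName "Private photo pages"
# Full server path to the password file (keep it outside the web root)
AuthUserFile /home/yoursite/.htpasswd
Require valid-user
```

The matching password file can be created on the server with Apache's htpasswd tool, e.g. `htpasswd -c /home/yoursite/.htpasswd editor`, and then the username and password are given only to the intended recipients.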
Best Regards, Martin