How to Block Google From Indexing a Website
- 1). Open Notepad or another plain-text editor.
- 2). Type "User-agent: Googlebot" on the first line of a new text file. If you want to block all search engine robots instead of just Google's, type "User-agent: *". Press "Enter" to move to the next line.
- 3). Type "Disallow: /" on the second line of the text file.
- 4). Save the text file with the name "robots.txt." The name must be exactly that, in lowercase, for crawlers to find it.
- 5). Connect to your Web server using a File Transfer Protocol (FTP) program and upload the file "robots.txt" to the root directory of your website. The root directory is often named "public_html," though this varies by hosting provider.
- 6). Open a Web browser and type "www.example.com/robots.txt", where "example.com" is your website's domain name. You should see the text file that you created appear in the browser window, confirming that the file is live; Googlebot will stop crawling your site the next time it fetches the file. Note that robots.txt blocks crawling, not indexing: pages that other sites link to can still appear in Google's results, just without a description. To keep pages out of the index entirely, allow crawling and add a "noindex" robots meta tag or X-Robots-Tag HTTP header to those pages.
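Put together, the file created in the steps above contains just two lines. The first names the crawler the rule applies to; the second disallows every path on the site (the lone "/" matches all URLs):

```
User-agent: Googlebot
Disallow: /
```

To block every well-behaved crawler rather than only Google's, replace "Googlebot" with "*".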
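Before (or after) uploading, you can sanity-check the rules with Python's standard-library robots.txt parser. This is a small sketch: "example.com" is a placeholder domain, and the rules string mirrors the file built in the steps above.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents matching the steps above.
rules = """User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is disallowed from every path on the site.
print(parser.can_fetch("Googlebot", "https://example.com/"))      # False
print(parser.can_fetch("Googlebot", "https://example.com/page"))  # False

# A crawler not named in the file is unaffected by this rule.
print(parser.can_fetch("Bingbot", "https://example.com/"))        # True
```

`RobotFileParser.can_fetch()` applies the same matching logic crawlers use, so it is a quick way to confirm the file says what you intended before it goes live.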