Google(TM) bot access for registered users' pages

This plugin allows spiders and robots to access the pages of the site reserved to 'Registered' users.

Sometimes you have to protect interesting content behind user registration, for commercial purposes or simply to build a community. But if the content is not accessible, how can users know it exists?

With this plugin, search engines can index these pages and bring more visitors to your site.

You can define a user for every search engine and, using the Joomla 2.5/3.x ACL, specify which pages are readable by each robot and by registered users, and which pages only registered users can read; or you can simply let spiders read all the pages.


Search engine robots are recognized and automatically logged in as a specific Joomla! user, so they are allowed to read the content reserved to that user or group.

This way the content is indexed, but no cached copy is made, thanks to the 'noarchive' meta tag. A user can find the pages on the search engine, but has to register to see their content: a normal visitor is not logged in and is redirected to the login/registration page!



Without an IP check, an advanced user can easily impersonate a search engine's bot, so don't use this plugin to protect very confidential information.
Of course, confidential info should not be exposed to search engines anyway! The IP check feature will be released in a future version.


Known Bugs and Limitations

  1. Not yet 'expert user' proof.
  2. The noarchive meta is set for all pages, not only for reserved ones.


To Do (not in execution order)

  1. Content plugin for setting the 'noarchive' meta header only on reserved pages.
  2. IP check for bot recognition.
  3. Translations :)
  4. Random password generated on plugin installation.
  5. Selection of the user group for user creation.


Parameters configuration

Basic configuration

  • Search bots | Joomla user:
    The search engine bot's name, as it appears in the user agent string, and the related Joomla user; pipe-separated, one pair per line.
    The plugin comes with three preconfigured bots and users, plus two dummy bots with a generic user. You can safely remove the dummy bots.
    DO NOT use the full user agent string unless you really know what you are doing. Use only ASCII characters for the bot's name and the user name to avoid mismatches in name comparison. If the bot's name is not present in this list, the search engine will not be recognized.
    For easier user handling, follow this naming convention: 'SE' + bot name + 'Bot'; no spaces or non-ASCII characters in the bot's name.
  • Use generic user:
    Use the 'SEGenericBot' Joomla user for all search engine bots. This is fine for most uses.
  • Check if user exists:
    The plugin checks whether the user exists and creates it if needed.
  • Joomla! users password:
    Password used to impersonate Joomla! users. This password MUST ALWAYS match the password in the Joomla! user accounts, so if you change it after user creation you have to update it for all those users in Joomla! user management.
  • Set noarchive meta:
    Set the 'noarchive' value in the robots meta tag to avoid the creation of a cached copy. IF YOU DISABLE IT, YOU MUST PROVIDE THE META TAG IN ANOTHER WAY!
  • Send Email Alert:
    Send an alert email when bot access is allowed.
  • Send mail to:
    Email address to be notified of bot access on authentication. If blank, mail is sent to the site email address.
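The 'Search bots | Joomla user' parameter above is a list of pipe-separated pairs, one per line. As a sketch of how such a parameter can be parsed (a hypothetical helper, not the plugin's own code):

```python
def parse_bot_lines(text: str) -> dict[str, str]:
    """Parse 'bot name | Joomla user' pairs, pipe-separated, one per line,
    as in the 'Search bots | Joomla user' parameter. Bot names are
    lowercased for case-insensitive user agent matching."""
    pairs = {}
    for line in text.splitlines():
        if "|" not in line:
            continue  # skip blank or malformed lines
        bot, user = (part.strip() for part in line.split("|", 1))
        if bot and user:
            pairs[bot.lower()] = user
    return pairs
```

For example, a configuration value of "googlebot | SEGoogleBot" on one line and "bingbot | SEBingBot" on the next yields a two-entry mapping from bot name to Joomla user.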

Advanced configuration

  • Debug Mode:
    Enable debug system messages; use only when testing! For testing you can use a user agent switcher in your browser (this doesn't work with IP matching).
  • Authenticate bots on IP:
    NOT YET IMPLEMENTED! The bot's IP MUST match one of the following.
  • Search bots | Allowed ip:
    NOT YET IMPLEMENTED! Search bot identification name and allowed IP or network (wildcard '*' allowed); pipe-separated, one per line.
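Since the IP check is not yet implemented, the following is only a sketch of what wildcard matching like '66.249.*' could look like; the function name and the example network are assumptions, not part of the plugin.

```python
from fnmatch import fnmatch

def ip_allowed(ip: str, patterns: list[str]) -> bool:
    """Check an IP address against allowed patterns, where '*' is a
    wildcard, e.g. '66.249.*' (an illustrative network, not an official
    list of crawler addresses)."""
    return any(fnmatch(ip, pattern) for pattern in patterns)
```

Exact IPs and wildcard networks can be mixed freely in the pattern list, since a pattern without '*' only matches itself.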


Cloaking is not a good idea

This plugin can show different content to humans and to search engines. Using the Joomla! 2.5+ ACL you can create different pages for every bot, and if you are able to hack the code a bit, you can redirect humans to specific pages, but let me say this is not a good idea. Cloaking is an incorrect, unethical way to advance in the SERPs. Your site could be banned for it.



Joomla 2.5 & 3.x




0 #5 stranakrovi 2023-05-10 20:56
Great extension. Can you please release the same for Joomla 4? Thanks.
+1 #4 newtojoomla 2015-07-09 15:50
Hi, Marco, just wanted to follow up on my last post in case you haven't had a chance to check. Thanks in advance for your support!

Post with an email address if you want to be sure to get a reply...
+2 #3 newtojoomla 2015-07-07 17:56
Hi, Marco,

Was hoping this could help index my new membership only website. Tried installing this on localhost just to understand how the configuration works but found that it isn't working as I thought it would. Could you please help?

1) User accounts for bots were not set up after plugin is Enabled. Is there anything I need to do?
2) Password: Do I need to create a password manually before enabling the plugin or should I simply copy the password from Users Management to the plugin once the accounts are set up?

Many thanks for your support!

Hi Jensen,

The user will be created only after a match with a specific user agent, so you should use a browser plugin to impersonate the search engine bot before you can see the plugin working.

It's better to set a new password (in the plugin configuration) before the users are created; the users will be created by the plugin as needed.

0 #2 K.Maidment 2015-01-29 23:49
Doesn't work. The Google Mobile-Friendly test says there are 21 resources that are blocked. I double-checked with the Google robots.txt Tester and found the results are the same as if this plugin were not there, and yet it is enabled and set up as per your instructions. The Google Mobile-Friendly tester says it cannot render my site and this will seriously affect my SEO.

Hi K,
You have to add the new crawler's signature in the plugin configuration, as explained in 'Basic configuration'.
Anyway, I suppose you have a wrong configuration in Joomla's ACL, because Google always uses the 'googlebot' key in the user agent string.

0 #1 Guest 2013-12-18 01:17

