Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
Modification Name: Spider Bait
Author(s): Charles Capps
Description: This modification creates spider-friendly URLs for your UBB, meaning spiders can begin crawling your UBB.
Demo: https://www.ubbdev.com/ubbcgi/ultimatebb.cgi
Requirements: UBB.classic™ 6.6.0; might work on older versions
Download Link: http://mods.lkworld.com/spiderhack.txt
Credits: Charles Capps
Notes: Charles Capps is the only author of this mod; I'm only unofficially supporting it.
Number of Downloads: [img]https://www.ubbdev.com/lk/num.php?s=spiderhack.txt[/img]
Reference: http://www.google.com/search?q=site:www.ubbdev.com+ubbdev.com&hl=en&lr=&ie=UTF-8&start=40&sa=N&filter=0 to see how well it crawls UBBDev
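To illustrate the idea (not the mod's actual Perl code): the hack turns UBB's query-string links into path-style links that spiders treat as ordinary static pages, in the `ultimatebb.php/ubb/forum/f/1` shape seen later in this thread. A rough Python sketch of that URL transformation, with the rewrite format assumed from those examples:

```python
from urllib.parse import urlsplit, parse_qsl

def spiderize(url):
    """Rewrite a UBB query-string URL into a path-style, spider-friendly URL,
    e.g. ultimatebb.cgi?ubb=get_topic&f=1&t=000096
      -> ultimatebb.cgi/ubb/get_topic/f/1/t/000096.
    Illustrative only -- the real mod does this inside UBB itself, in Perl."""
    parts = urlsplit(url)
    path = parts.path
    # Fold each key=value pair into the path as /key/value segments.
    for key, value in parse_qsl(parts.query):
        path += "/%s/%s" % (key, value)
    return path

print(spiderize("ultimatebb.cgi?ubb=get_topic&f=1&t=000096"))
# -> ultimatebb.cgi/ubb/get_topic/f/1/t/000096
```

The server side then has to map those path segments back onto the original parameters (via PATH_INFO), which is the part the downloaded hack patches into ultimatebb.cgi and ultimatebb.php.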

Old Hand
Joined: Feb 2001
Posts: 2,285
I'll be installing this tonight.  Thanks CC, LK!

I type Like navaho
Joined: Mar 2000
Posts: 21,079 Likes: 3
Thanks LK 

Member
Joined: Jan 2002
Posts: 88
Thanks a lot, man... I have installed it, and UBB is still working. Now I'm really curious what will happen with all the spiders... let them all come to me. Cheers, man! (Although it's too bad my compact headers are gone.)

Member
Joined: Jan 2002
Posts: 88
Well, I tried to figure out where the compact headers went, but I don't have any clue. Is there anyone who can help me with this?

Master Hacker
Joined: Jan 2003
Posts: 3,456 Likes: 2
quote: Originally posted by messagedj: well i tried to figure out where the compact headers are... but i dont have any clue.. is there anyone that can help me on this..

http://www.djwebpages.com/ *points* found them

Member
Joined: Jan 2002
Posts: 88
Hmm... I don't think I get that, Al.
Can you (or someone else) explain what that *points* means?

Master Hacker
Joined: Jan 2003
Posts: 3,456 Likes: 2
 when I click that link I see the compact headers

Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
lol, I also see the compact headers...
btw, check the compact headers instructions for ubb_forum_summary.cgi, it'll move "Welcome to our newest member" to the right-hand side..

Moderator
Joined: May 2001
Posts: 1,042 Likes: 7
Seems my server won't let me do it. Running Apache 2.x, PHP 4.3.3, ActivePerl 5.8 on a Win XP box. It works fine when going to ultimatebb.cgi, but nada for PHP (it gives 404 errors). I'd rather not go into httpd.conf and mess with the "ForceApplication xhttpd-php" thing, or whatever it is, and break things further than I already have. Is there any way I can modify the .htaccess file to do this? *scratches head*

Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
messagedude, it's not because of this modification - I think when you hacked your UBB files to add it, you accidentally hacked and uploaded old files. Also, if you go to http://www.djwebpages.com/cgi-bin/2ubb/ultimatebb.cgi?ubb=get_topic&f=1&t=000096 you'll see there are no compact headers, so it's not because of the hack.
brett, that's odd; nobody else has had that problem. Anyhow, what I added in httpd.conf is:
LoadModule php4_module "f:/php/sapi/php4apache2.dll"
AddType application/x-httpd-php .php
Try adding that to your conf instead of all the PHP things you've added so far.

Member
Joined: Jan 2002
Posts: 88
LK> that's kinda strange...
Maybe you're right, although I really don't understand why and how I did that. Anyhow, I will let you know when I have checked everything.
Any idea how long it will take for the spiders to start crawling?

Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
Took UBBDev 4 days (Aug 28th to Sep 1st).

Moderator / Kingpin
Joined: Feb 2001
Posts: 817
quote: --- Files Modified ---
CGI: ultimatebb.cgi, ubb_forum.cgi, ubb_lib_posting.cgi
NonCGI: ultimatebb.php
Templates: ubb_forum_summary.pl, ubb_forum_page.pl, ubb_topic_page.pl

FYI, the Templates are actually: public_forum_summary.pl, public_forum_page.pl, public_topic_page.pl

Moderator
Joined: May 2001
Posts: 1,042 Likes: 7
Thanks LK, that did the trick! What I had in there before was:
ScriptAlias /php/ "c:/program files/apache group/php/"
AddType application/x-httpd-php .php .php3
Action application/x-httpd-php "/php/php.exe"
Thanks much!

UBBDev / UBBWiki Owner Time Lord
Joined: Jan 2000
Posts: 5,833 Likes: 20
make a .htaccess file with that in it lol...
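One caveat on the .htaccess route, for anyone following along: of LK's two directives, only the type mapping can live in .htaccess; `LoadModule` is allowed in the server config only, so it has to stay in httpd.conf. A hypothetical per-directory file (assuming the server's `AllowOverride` permits FileInfo):

```apacheconf
# .htaccess in the UBB directory -- hypothetical sketch.
# LoadModule cannot go here; keep it in httpd.conf.
# This line alone maps .php files to the PHP 4 handler:
AddType application/x-httpd-php .php
```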

Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
geekydude, I'll change it as soon as I care  brett, yay 

Veteran
Joined: Oct 2000
Posts: 2,667
what about content islands?
Do you believe in love at first sight, or should I walk by again?

Admin / Code Breaker
Joined: Mar 2001
Posts: 7,394
what about them 

Veteran
Joined: Oct 2000
Posts: 2,667
Well, can't we have them use short URLs too? That way spiders might crawl them when they are located on another website.

Admin Emeritus
Joined: Jan 2000
Posts: 5,073
Most CIs are included via Javascript, which most spiders won't bother executing.
UBB.classic: Love it or hate it, it was mine.

UBBDev / UBBWiki Owner Time Lord
Joined: Jan 2000
Posts: 5,833 Likes: 20
unless you use an include 
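To spell out the include route: a server-side include merges the island's HTML into the page before it leaves the server, so a crawler indexes it like any other markup, unlike a JavaScript island that the spider never executes. A hypothetical SSI line (assumes mod_include is enabled and an island file exists at the example path):

```html
<!-- Server-side include: the island's HTML is merged into the page
     on the server, so crawlers see it as ordinary content.
     The path below is only an example. -->
<!--#include virtual="/islands/recent_topics.html" -->
```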

Veteran
Joined: Oct 2000
Posts: 2,667
Does the spider thing apply to RSS too?

Admin Emeritus
Joined: Jan 2000
Posts: 5,073
Most spiders are only HTML-aware.

Member
Joined: May 2001
Posts: 283
Am I the only one that can't find: Templates: ubb_forum_summary.pl, ubb_forum_page.pl, ubb_topic_page.pl ? 

Member
Joined: May 2001
Posts: 283
Seriously though, I ran into a problem. After I implemented the hack I cleared my cache and visited my forum summary page. From there, every link that I clicked on for a forum or post took me right back to the forum summary page. I couldn't get anywhere.
Any ideas why?

Admin Emeritus
Joined: Jan 2000
Posts: 5,073
You probably haven't properly installed the code for ultimatebb.cgi and ultimatebb.php...

Member
Joined: May 2001
Posts: 283
I figured it out. I still had forums using JCTemplates. I fixed it and it's working now.
I went to google and re-submitted my site. Which URL should I give them though?

Member
Joined: Jan 2002
Posts: 88
And if you want to submit your UBB to even more search engines, try this one: http://www.ineedhits.com/add-it/free/ - you only need to enter your details once, and you can submit your site to 30 search engines across the net. Cheers!

UBBDev / UBBWiki Owner Time Lord
Joined: Jan 2000
Posts: 5,833 Likes: 20
Sweet, I'll definitely look into it momentarily

Member
Joined: Jan 2002
Posts: 88
If you have installed this hack and you like it, but you don't want some sections of your UBB to be crawled, you can use robots.txt.

When a robot visits a web site, say https://ubbdev.com/, it first checks for https://ubbdev.com/robots.txt. If it finds this document, it analyzes its contents to see which documents it is allowed to retrieve. You can customize the robots.txt file to apply only to specific robots, and to disallow access to specific directories or files. Here is a sample robots.txt file that prevents all robots from visiting the entire site:

User-agent: *   # applies to all robots
Disallow: /     # disallow indexing of all pages

The robot simply looks for a "/robots.txt" URI on your server, where a site is defined as an HTTP server running on a particular host and port number. For example:

Site: https://ubbdev.com/      robots.txt URL: https://ubbdev.com/robots.txt
Site: https://ubbdev.com:80/   robots.txt URL: https://ubbdev.com:80/robots.txt

There can only be a single "/robots.txt" on a site. Specifically, you should not put robots.txt files in user directories, because a robot will never look at them. If you want your users to be able to create their own robot rules, you will need to merge them all into the single "/robots.txt"; if you don't want to do this, your users might want to use the Robots META tag instead.

Some tips:
- URLs are case-sensitive, and the "/robots.txt" string must be all lower-case.
- Blank lines are not permitted within a single record in the robots.txt file.
- There must be exactly one "User-agent" field per record. The robot should be liberal in interpreting this field; a case-insensitive substring match of the name, without version information, is recommended. If the value is "*", the record describes the default access policy for any robot that has not matched any of the other records. Multiple such records are not allowed in "/robots.txt".
- The "Disallow" field specifies a partial URI that is not to be visited. This can be a full path or a partial path; any URI that starts with this value will not be retrieved. For example, "Disallow: /help" disallows both /help.html and /help/index.html, whereas "Disallow: /help/" would disallow /help/index.html but allow /help.html. An empty "Disallow" value indicates that all URIs can be retrieved. At least one "Disallow" field must be present in each record.

A good example of a robots.txt file is:

# robots.txt for http://www.djwebpages.com/
# $Id: robots.txt,v 1.22 2003/09/11 20:23:04 ted Exp $

# For use by search.w3.org
User-agent: W3Crobot/1
Disallow: /Out-Of-Date

# AltaVista Search
User-agent: AltaVista Intranet V2.0 W3C Webreq
Disallow: /Out-Of-Date

# exclude some access-controlled areas
User-agent: *
Disallow: /Images
Disallow: /Privat
Disallow: /cgi-bin/linkexchange/
Disallow: /Web
Disallow: /History
Disallow: /Out-Of-Date
Disallow: /2003/09/mid
Disallow: /People/all/

To create your own robots.txt quickly, you can use RobotPack:
- 100% freeware; no spyware, no sponsorware, and no adware
- Create robot-exclusion files by selecting documents and directories
- Log into FTP servers and upload robots.txt from RobotPack
- Manage projects for multiple web sites
- Comes with the Open Robots.txt Directory (ORD)
- Lets you edit the robot database and add additional user-agents

Download RobotPack. Note: I haven't tested this with UBB yet, but I'm almost 100% sure it will work.
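If you want to sanity-check a robots.txt before uploading it, Python's standard library can evaluate the rules locally; a small sketch, where the host and the two rules are just examples in the spirit of the sample file above:

```python
from urllib.robotparser import RobotFileParser

# Example rules: block a private area and one CGI subdirectory for all robots.
rules = [
    "User-agent: *",
    "Disallow: /Privat",
    "Disallow: /cgi-bin/linkexchange/",
]
rp = RobotFileParser()
rp.parse(rules)

# Disallow is a prefix match: "/Privat" blocks /Privat and everything under it.
print(rp.can_fetch("*", "https://ubbdev.com/Privat/notes.html"))      # False
# A path that doesn't start with a disallowed prefix stays crawlable.
print(rp.can_fetch("*", "https://ubbdev.com/cgi-bin/ultimatebb.cgi"))  # True
```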

Member
Joined: May 2001
Posts: 283
I've encountered a problem on my 6.5.0 board running this hack. In Today's Active Topics you cannot get to page 2, or any page other than page 1; it just sends you back to page 1 over and over. Also, from the Forum Summary page, or from within one of the forums, you cannot access the category pages.

Admin Emeritus
Joined: Jan 2000
Posts: 5,073
I am unable to duplicate the category problem... the TAT problem is actually a bug in the Accelerator, unrelated to the hack...

Member
Joined: May 2001
Posts: 283
And here's another problem this hack seems to have caused in 6.5.0: when not logged in, visiting the forums gives you nothing but white space (see screenshot: http://www.catcherman.com/images/not_logged_in.gif).

Admin Emeritus
Joined: Jan 2000
Posts: 5,073
The hack most certainly cannot cause that. It's just replacing some URLs and injecting some new code...

Member
Joined: May 2001
Posts: 283
I believe it was the part about activating the Accelerator. Something was screwy with it. I just turned it off and back on, then refreshed the board cache (in the CP), and now it seems to work when logged out. Gremlins, I tell ya!
[edit] The white-space screenshot above happens when a visitor tries to visit http://www.catcherman.com/ubb/ultimatebb.php/ubb/forum/f/1 - I can't figure out what's causing the problem.

Member
Joined: May 2001
Posts: 283
What happens if you run this hack without activating the UBB Accelerator?