Argh, my bad. I was multi-tasking and didn't even notice that the PHP files were completely different. Line 64 of functions_autotaggerfromcontentandtitle.php is:
Code:
function contains($text, $tag, $related_words) {

That mod (https://vborg.vbsupport.ru/showthread.php?t=234026) seems to be conflicting with your system.
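For anyone hitting the same clash: both products appear to declare a global PHP helper named contains(), which would make PHP stop with a "Cannot redeclare" fatal error when both are installed. Purely as a sketch (the guard and the body below are illustrative, not the actual fix shipped in 1.2a), a plugin can protect itself against that like this:

Code:
// Illustrative only: declare the shared helper just once, no matter how
// many products ship a function with this name.
if (!function_exists('contains'))
{
    function contains($text, $tag, $related_words)
    {
        // placeholder body for the example; case-insensitive substring check
        return (stripos($text, $tag) !== false);
    }
}

A product-specific prefix on the function name (something like myproduct_contains(), say) avoids the collision entirely, at the cost of not sharing the helper between products.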
Quote:
Please download and install the current version (1.2a) and let me know if this fixes the issue on your end. :D
Quote:
Awesome, thanks! The system works perfectly now, and I love that the matched words are included. I do have a suggestion, though, if you don't mind. I noticed that the system takes the full report text, i.e. the wording that gets posted, from the *.php file.
Is there a way a template and Languages & Phrases could be used instead, so the whole report text can be adjusted to fit the forum without changing the PHP file(s)? And could the Postcount Threshold also accept -1, so every single member is checked regardless of post count? Again, thanks for this system, it's very useful.

--- EDIT: An option to exclude specific sections/sub-sections would be nice as well ^_^.
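For what it's worth, vBulletin 3.x does provide the pieces for this: phrases live in the global $vbphrase array and can be combined with arguments via construct_phrase(). A rough sketch of what the report text could look like if it were moved into a phrase (the phrase name and variables here are hypothetical, not taken from the product):

Code:
// Sketch only: build the report text from an editable phrase instead of a
// hard-coded string. 'flagged_content_report' would be created under
// Languages & Phrases, e.g. "Post by {1} contains the flagged word {2}".
global $vbphrase;
$report_text = construct_phrase(
    $vbphrase['flagged_content_report'],
    $poster_username,   // hypothetical variable: author of the flagged post
    $matched_word       // hypothetical variable: the word that triggered the report
);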
Quote:
:) appreciate it!
Mind if I make another suggestion? :P Is there any way the bad word filter could have a mode that matches a specific word or URL plus anything that follows it? It's hard to explain, so I'll give an example.

Suppose you filter "http://google.com" (yes, that entire URL). That reports all posts/threads and signatures containing exactly "http://google.com", but a longer URL such as http://google.com/searchresulthere is not reported at all, because the system only treats the exact string "http://google.com" as the bad word; anything added after the link slips through.

What I'd personally like to see is that when you filter a specific word, or a URL in this case, anything beyond it also gets reported, e.g. http://google.com/* with a trailing *. I've seen some plugins use a method like this, I just forget what they were called. I think other systems used quotation marks for this kind of job in the bad word list, e.g. test1,test2,test3,"http://google.com",test4,test5, so that http://google.com and anything appended to that URL gets reported as well.

Hard to explain, but I think you understand. The reason I'm suggesting this is that we don't want the bare word itself to be filtered, only the URL and anything added inside it, so ordinary words that happen to appear in the URL don't trip the filter.
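Just to make the request concrete, here is a rough sketch of the kind of matching being described, assuming a trailing * on a bad-word entry is taken to mean "this plus anything after it" (the function name and the wildcard convention are only for the example):

Code:
// Sketch: treat a trailing * on a flagged entry as a prefix wildcard, so
// 'http://google.com/*' also catches 'http://google.com/searchresulthere'.
function example_is_flagged($post_text, $flagged_entry)
{
    if (substr($flagged_entry, -1) == '*')
    {
        $prefix = substr($flagged_entry, 0, -1);
        return (stripos($post_text, $prefix) !== false);
    }

    // plain entries keep the ordinary behaviour
    return (stripos($post_text, $flagged_entry) !== false);
}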
As it is, if you enter http://google.com as a flagged word and a new user then has https://www.google.com/search?q=gauss&ie=utf-8&oe=utf-8 in their post, the post will be reported. Are you saying you would want an option for that not to be reported?
Odd, I'd have to try it out again then. I tried adding http://google.com as a bad word, but when I entered a full Google URL like the one in your post, it didn't report it. Let me double check to see if this is still the case.
Okay, I see the issue...if http://google.com is entered as the flagged word, then https://www.google.com* will not be reported...what would have to be entered as the flagged word is https://www.google.com.
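Put differently, the check appears to be a literal string comparison, so "http://google.com" never occurs inside "https://www.google.com/search?...". One way the http/https and www/non-www variants could be folded together is to normalise the URL before comparing; the helper below is an illustration only, not how the product currently works:

Code:
// Sketch: strip the scheme and a leading 'www.' so variants of the same
// flagged host compare equally before the prefix/substring check runs.
function example_normalise_url($url)
{
    $url = preg_replace('#^https?://#i', '', $url);
    $url = preg_replace('#^www\.#i', '', $url);
    return strtolower($url);
}

// example_normalise_url('http://google.com')                     => 'google.com'
// example_normalise_url('https://www.google.com/search?q=gauss') => 'google.com/search?q=gauss'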
I think you noticed my issue now :D, though I might as well give an example with the URL I had in mind.
I've filtered: http://steamrep.com/

Now, when I make a new thread/post that includes http://steamrep.com/search?q=skyrider, the system does not report it, even though http://steamrep.com is in the bad word list. So, would it be possible for the system to detect the bad word here, seeing that "http://steamrep.com" is already on the list?
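For what it's worth, "http://steamrep.com/" is a literal prefix of "http://steamrep.com/search?q=skyrider", so a plain case-insensitive substring test on those two strings does match; the snippet below just makes that comparison concrete (illustration only, it says nothing about how the product itself scans posts):

Code:
// stripos() finds 'http://steamrep.com/' at position 0 of the longer URL.
var_dump(stripos('http://steamrep.com/search?q=skyrider', 'http://steamrep.com/') !== false);
// prints: bool(true)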