How exactly is Request Control different from tools that specialize in URL cleaning, like CleanUrls (apart from being more versatile, that is)? Request Control does not seem to filter gclid and dclid (Google ad-tracking parameters), which CleanUrls has no problem doing. I found this tool because CleanUrls is not flexible enough for my needs, but Request Control seems to ship far fewer filters than other URL cleaners. Is that by design (i.e. blocking and filtering are expected to be set up manually by the user), or are the filter lists a work in progress? Also, can it clean ETag headers?
I'd like to know this as well. I use Clearurls and would like to switch to Request Control, but I don't know if I would need to complement it with other add-ons to achieve the same results (e.g. to block ETag tracking).
Who cares about ETags?
No, RC cannot clean ETag headers, and IMHO it should not get such a feature. RC can manipulate URLs in a very powerful way, and that's all it should do. Manipulating headers can be done in another extension (e.g. Header Editor).
Unix philosophy:

> Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
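To illustrate what the URL-level cleaning discussed above amounts to (as opposed to header manipulation), here is a minimal Python sketch that strips query parameters like gclid/dclid from a URL. The blocklist is a hypothetical example, not Request Control's or CleanUrls' actual rule set:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical blocklist; gclid/dclid are the Google ad-click IDs
# mentioned in the question, the utm_* entries are common analytics tags.
TRACKING_PARAMS = {"gclid", "dclid", "utm_source", "utm_medium"}

def clean_url(url: str) -> str:
    """Drop known tracking parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    # Rebuild the URL with only the non-tracking parameters.
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/page?id=42&gclid=abc123"))
# → https://example.com/page?id=42
```

Note this only touches the URL; an ETag arrives in an HTTP response header, which is exactly why a URL rewriter can't reach it and a header-editing extension is needed instead.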