Originally Posted by tso
That it happened gradually over time made me look for a file that would accumulate data and survive an update of packages.
What I found was .webkitcookies, and after nuking that, webkit was blazingly fast again!!!
Tso, you are a hero! Like you I was wondering about something cumulative, but thinking more of the cache, as it already caused problems when it grew large in old PC versions of Netscape, a long time ago...

My cookie store was over 300 KB, mostly ads and tracking cookies of course... once trimmed back to only the useful ones, it is now down to 14 KB... and my Tear is back to its original speed. Thanks for thinking of this! I'm glad I posted about my "specific" problem after all :-)

Yes people, that means one has to log back into forums and the like...
I do have quite a lot of logins stored in there, so here's what I did:

In xterm, I loaded .webkitcookies in vi and manually deleted all lines not related to domains I log in to. This took less time than it sounds, as the fluff tends to group together, and vi makes it easy: "dd" to delete the first line, then "." (dot) to repeat it on the others, then save (:wq or ZZ).
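If the fluff clusters under a few domains, vi's :g command can clear each one in a single stroke. A quick sketch, where doubleclick.net just stands in for whatever ad domain shows up in your own jar:

    :g/doubleclick\.net/d
    :wq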

I copied the resulting file to a backup named .webkitcookies_logins. This way, when the cookie jar grows too large again, I'll just copy the saved one back over it (I very seldom add new logins).
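In commands, something like this (a sketch, assuming the jar sits in your home directory, as it does on my tablet):

    cd ~
    cp .webkitcookies .webkitcookies_logins    # keep the trimmed jar safe
    # ...later, when the jar has bloated again:
    cp .webkitcookies_logins .webkitcookies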

The proper way, of course, would be to write a "cleansing" script as daperl suggested, as most of the fluff comes from a handful of domains.
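A minimal sketch of such a script, assuming .webkitcookies is a plain-text jar with one cookie per line and the domain appearing somewhere on that line; the domains in KEEP are only examples, put your own login sites there:

    #!/bin/sh
    # keep only cookie lines that mention a domain we log in to
    KEEP='talk\.maemo\.org|slashdot\.org'                  # example whitelist
    cp "$HOME/.webkitcookies" "$HOME/.webkitcookies.bak"   # safety copy first
    grep -E "$KEEP" "$HOME/.webkitcookies.bak" > "$HOME/.webkitcookies"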

The surprising thing is that images, small or large, are now loading very quickly again in Tear. I don't get the connection with cookies: is this a webkit bug, or what?...
 
