daperl
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#141
Originally Posted by tso View Post
that it happened gradually over time made me look for a file that would accumulate data and survive an update of packages.

what i found was .webkitcookies. and after nuking that, webkit was blazingly fast again!!!

yes people, that means one has to log back into forums and similar...

If you didn't already know, it's just a text file. In the future, instead of blowing it away you could save some login time by putting something like the following in a script:

Code:
cp ~/.webkitcookies ~/.webkitcookies.old
egrep -i '^[^[:space:]]*(maemo\.org|internettablettalk\.com|google\.com|yahoo\.com)[[:space:]][[:space:]]*(TRUE|FALSE)[[:space:]]' ~/.webkitcookies.old > ~/.webkitcookies
rm ~/.webkitcookies.old
Or you could use it just to clean stuff up.

EDIT:

Here's the inverse:

Code:
cp ~/.webkitcookies ~/.webkitcookies.old
egrep -v -i '^[^[:space:]]*(interclick\.com|advertising\.com|doubleclick\.com|specificclick\.com)[[:space:]][[:space:]]*(TRUE|FALSE)[[:space:]]' ~/.webkitcookies.old > ~/.webkitcookies
rm ~/.webkitcookies.old
__________________
N9: Go white or go home

Last edited by daperl; 2009-05-03 at 04:16.
 

tso
Posts: 4,783 | Thanked: 1,253 times | Joined on Aug 2007 @ norway
#142
meh, i do not have that many pages to log into

still, this time round it was mostly a quick and dirty test. next time i may try your suggestion.
 
daperl
Posts: 2,427 | Thanked: 2,986 times | Joined on Dec 2007
#143
Originally Posted by tso View Post
meh, i do not have that many pages to log into

still, this time round it was mostly a quick and dirty test. next time i may try your suggestion.

Yeah, I'm just trying to keep some of my chops up, and I appreciate your diligent bug tracking. So, some combination thereof is why I posted.
__________________
N9: Go white or go home
 
Bundyo
Posts: 4,708 | Thanked: 4,649 times | Joined on Oct 2007 @ Bulgaria
#144
Bah, the cookies hack bites back

Hmm, I wonder if that can be changed.

Btw, some more bad news - libsoup 2.4 can't be compiled for Maemo 4, because it requires GLib 2.15 (and specifically GIO). If someone has any ideas or a patch for a GIO-less libsoup, be welcome. The good news is that libsoup 2.4 is included in Fremantle and probably Mer.
__________________
Technically, there are three determinate states the cat could be in: Alive, Dead, and Bloody Furious.

Last edited by Bundyo; 2009-05-03 at 08:27.
 
Matan
Posts: 1,224 | Thanked: 1,763 times | Joined on Jul 2007
#145
I compiled libsoup on Maemo 4 by:

glib-2.16.6:
Code:
./configure --prefix=$HOME/test --exec-prefix=$HOME/test --enable-static
make install

libsoup-2.4.1:
Code:
export PKG_CONFIG_PATH=$HOME/test/lib/pkgconfig/:/usr/lib/pkgconfig/
export LD_LIBRARY_PATH=$HOME/test/lib:$LD_LIBRARY_PATH
./configure
make

To get it to work on the tablet, I guess that you need to copy the libraries from $HOME/test/lib and libsoup/.libs to some directory on the tablet and add the directory to LD_LIBRARY_PATH before running tear.

If you get it running, it should not be hard to package all this so that it is easy for users to install. Or you can link tear with the static glib-2.16.6.
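For example, a rough sketch of that deployment step, matching the build above (the /home/user/newlibs directory and the plain "tear" invocation are just placeholders, not part of any actual package):

Code:
# copy the freshly built libraries to a directory on the tablet
mkdir -p /home/user/newlibs
cp $HOME/test/lib/lib*.so* /home/user/newlibs/
cp libsoup/.libs/libsoup-2.4.so* /home/user/newlibs/

# make the dynamic linker pick them up, then launch the browser
export LD_LIBRARY_PATH=/home/user/newlibs:$LD_LIBRARY_PATH
tear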
 

Bundyo
Posts: 4,708 | Thanked: 4,649 times | Joined on Oct 2007 @ Bulgaria
#146
I compiled GLib 2.18 from Fremantle and libsoup with it, but if I link libsoup statically against that GLib, I also have to link the same GLib into libwebkit (whose libsoup interface also uses GIO), and in the end it becomes one big hack that would be impossible to get into extras-devel. I was thinking of statically linking GLib and libsoup inside libwebkit, but I want to link only those two and don't know if that's even possible.

I also compiled Curl 7.18.2, but the pipelining option doesn't have much effect. I'm still experimenting though.
__________________
Technically, there are three determinate states the cat could be in: Alive, Dead, and Bloody Furious.
 
fpp
Posts: 2,853 | Thanked: 968 times | Joined on Nov 2005
#147
Originally Posted by tso View Post
that it happened gradually over time made me look for a file that would accumulate data and survive an update of packages.
what i found was .webkitcookies. and after nuking that, webkit was blazingly fast again!!!
Tso, you are a hero! Like you, I was wondering about something cumulative, but thinking more of the cache, as it already caused problems when it grew large in old PC versions of Netscape, a long time ago...

My cookie store was over 300KB, mostly ads and tracking cookies of course... once trimmed back to only the useful ones, it is now down to 14KB... and my Tear is back to its original speed. Thanks for thinking of this! I'm glad I posted about my "specific" problem after all :-)

yes people, that means one has to log back into forums and similar...
I do have quite a lot of logins stored in there, so here's what I did:

In xterm, I loaded .webkitcookies in vi and manually deleted all lines not related to domains I log in to. This took less time than it sounds, as the fluff tends to group together, and vi makes it easy: "dd" to delete the first line, then "." (dot) for all the others, then save (:wq or ZZ).

I copied the resulting file to another one named .webkitcookies_logins. This way, when the cookie jar grows too large again, I'll just copy the saved one back over it (I very seldom add new logins).

The proper way, of course, would be to write a "cleansing" script as daperl suggested, as most of the fluff comes from a handful of domains.
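For the copy-back step, a minimal sketch (assuming the trimmed file was saved as ~/.webkitcookies_logins as above, and best done while Tear is not running):

Code:
# when the cookie jar has grown bloated again, fall back to the saved login-only copy
cp ~/.webkitcookies ~/.webkitcookies.bak    # keep a backup, just in case
cp ~/.webkitcookies_logins ~/.webkitcookies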

The surprising thing is that images, small or large, are now loading very quickly again in Tear. I don't get the connection with cookies - is this a webkit bug, or what?...
 

Posts: 126 | Thanked: 94 times | Joined on Jun 2007 @ Berlin, Germany
#148
Perhaps it's simply resource contention, because the app is doing heavy work with the cookie data set in parallel to everything else that goes on while loading. Possibly I/O is the bottleneck - the cookie file is read in again and again at the same time as the images are written to cache or something. Probably needs profiling to determine.
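One quick way to check that hypothesis, assuming strace is available on the tablet and the browser process is indeed called tear, would be something like:

Code:
# watch whether the cookie file really is opened over and over while a page loads
strace -f -e trace=open -p $(pidof tear) 2>&1 | grep webkitcookies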
 
Bundyo
Posts: 4,708 | Thanked: 4,649 times | Joined on Oct 2007 @ Bulgaria
#149
Yes, the cookie file is read on every handle initialization, i.e. for every resource. That is enough to slow things down on the tablet. I'll try to do something about it.
__________________
Technically, there are three determinate states the cat could be in: Alive, Dead, and Bloody Furious.

Last edited by Bundyo; 2009-05-03 at 14:35.
 
Posts: 126 | Thanked: 94 times | Joined on Jun 2007 @ Berlin, Germany
#150
I imagine that the various other users of WebKit on mobile devices (Apple, Google, Nokia, etc.) have solved this somehow.
 