home screen gestures
Anybody interested in having more gestures on the home screen than swiping left/right to change desktops?
Anybody able/willing to help?
Re: home screen gestures
Quote:
I can help you ;)
Re: home screen gestures
This thread sounds interesting, and it'll be really insane if You guys actually implement smth new!!
Re: home screen gestures
smth? :o what is that?
Re: home screen gestures
I could be wrong, but I found hd_home_desktop_do_press and some other functions in hd-home.c from the hildon-desktop package. It should be "easy" to catch the moves from there.
For the gestures, a 3x3 grid should be enough for some basic shapes. No figure-8s, but L, U, N, Z sort of things.
Re: home screen gestures
Quote:
I dunno, but I guess I pulled this word from some of my English knowledge, which I got either in my native-language school or American school. smth stands for something.
Re: home screen gestures
Quote:
I am not good at programming :(
Re: home screen gestures
Most of the help I need is C related. If you have basic C skills and an interest in maths, you could figure out the procedure for recognizing the stroke. Mostly if-then-else things, I guess.
Re: home screen gestures
It would be awesome to get a cube as a swipe ;)
Normal 123456789 cube:
123
456
789
So 5 is the center, and from there you can get to 2, 4, 6 and 8, and it also should be possible to swipe from 9 to 3. That would be awesome ;)
Re: home screen gestures
I like math, but C... I can't. But if you have any beginner guide ;) send it to me.
Re: home screen gestures
OK thank you anyway, but...
Re: home screen gestures
If you resize a
123
456
789
grid to the gesture (e.g. a Z of 120x80 pixels), the gesture goes through the fields 1235789. We need a function that tracks the "mouse pointer" and adjusts the grid on the screen. I have a simple mouse pointer application where you could test the success. The alternative would be to look into hd-home.c and figure out how to disable the data handling there and hook up the above instead.
Re: home screen gestures
Quote:
That is not what is being discussed here. The OP was talking about dividing each desktop screen into smaller grids so as to make gesture tracking easier, so that we can make gestures like in Opera to do some actions (like maybe launching the phone) faster! Quote:
In this thread (http://talk.maemo.org/showthread.php?t=74324), joerg was also thinking about something along the same lines. Talk to him, it might turn up something interesting!
Re: home screen gestures
Quote:
As a fallback solution for myself, I was thinking about disabling all clicks in hildon-home and writing some daemon to watch the moves. From my computer-with-phone I can't send messages, so it'll have to wait till I get to my computer without the phone. In the meanwhile, anyone else...?
Re: home screen gestures
I second the idea of a system-wide solution, with activation on proximity sensor cover. Disabling clicks in hildon-home is a no-go; it would be a PITA when using an external mouse (or any mouse-like thing) via USB hostmode or Bluetooth.
Re: home screen gestures
Nice idea, dude...
It would be great: swipe left and right for desktops, but swipe up and down for quick access to messaging, the apps menu, etc.
Re: home screen gestures
...or Xterm! :D
Re: home screen gestures
As a starting point, I propose adding a gestures manager that allows launching utilities or applications using the proximity sensor.
Excuse my bad English.
Re: home screen gestures
whatever, just make it configurable ...
Re: home screen gestures
The cube is a good idea; I remember my old cellphone (W580) had a similar option. Check the video n_n, minute 2:10:
http://youtu.be/VVxUFptup8A
Re: home screen gestures
So there's not that much interest in this. It'll take some time till I get something sorted with this (network in scratchbox).
Disabling clicks sure isn't a good idea; maybe disabling just the stroke is. Anyway, that's not the no. 1 solution. It could use a config file; that's the easier part.
@Leon Obscuro MX: The cube is not a good idea; it's BS. However, there are a lot of people who could like it, so you should start a thread for a discussion about it. My plan and the cube only have corners in common.
@WhiteWolf: I don't understand.
Re: home screen gestures
Yeah, he thought about covering the proximity sensor as a "trigger" for gesture recognition; uncovering it would bring back the old'n'goodie point-and-click mode.
There already is a 100% working daemon for recognizing the proximity sensor state (not using many resources, in fact almost none at all), so this part of the idea may be considered done. It's "only" a matter of incorporating the feature.
BTW, as was mentioned in some huge thread a long time ago, it *is* possible to create multitouch for resistive screens. When you touch a resistive screen in two places, the actual "sensed" place is exactly in the center between them (so if You touch the left and right parts of the screen, the device will sense one touch in the center). Using some complicated algorithm, it's possible to recognize when there are many touch points at once, versus just a quick change of touching place. Someone (maybe qole, but I don't know for sure) once mentioned devices that incorporate that. If something like *this* could be done, it would be a real overkill.
Re: home screen gestures
My wife has an LG Viewty, which has a very nice function like this: on the lockscreen, you make a gesture (like W, V or S) and it starts the app linked to that gesture.
On Linux there's a program called Easystroke which can do things like this. Maybe you can borrow some ideas from their code: http://sourceforge.net/apps/trac/easystroke/wiki
Re: home screen gestures
Quote:
Quote:
@mivoligo: I am using Easystroke, and I love it. I tried to port it, but failed. It's written in C++ and I am not familiar with that. Anyway, I think it's too "big" for the N900 and the GUI would have to be rewritten. So a small solution would be enough for the beginning.
@WhiteWolf: OK, that's the plan.
Re: home screen gestures
I tried shortcutd, but the proximity sensor doesn't work well. Are there other alternatives? Is it working for anyone else?
Re: home screen gestures
ShortcutD does work here; did you configure it in the settings menu?
Re: home screen gestures
Hey, what do You mean by "doesn't work very well"? For me (and a bunch of other people) it's working flawlessly. Maybe it's something on Your device's side?
Re: home screen gestures
xdotool from the repositories helps achieve that. Just remember to set it up to send the right/middle click *on release*, not on the click itself, because it's not possible to emulate one click while doing another. On release it works pretty well, though.
Re: home screen gestures
Quote:
So as nice as proximity sensor activation is: make it configurable. And if the gesture detection is done RIGHT, it wouldn't ever be necessary.
All the background gesture detection daemon SHOULD do is process screen events and report to the rest of the system (over DBus, or something else if there's a more efficient channel, idk) that gesture [gesture number/name/whatever] was detected. Then applications should set up their own 'listeners' for such events, if they want to take advantage of specific gestures. You could, however, as part of the project provide hildon-desktop/home patches to handle some of the gestures by default (I'm thinking swipes from the bezel in either direction could be reserved for hildon-desktop, or at least the up/down swipes). This would probably get CSSU'ed eventually, if stable and good enough.
I would say the minimal requirement for gesture detection should be swipe-from-bezel from each side (and, if you want to go that far, from the corners of the bezel; optionally screen-into-bezel, or bezel-to-bezel). But be careful with the bezel stuff: as I say below, there's no touch sensitivity in the bezels afaik, so you'd really be watching for gestures that begin or end at the very EDGE of the screen, which could interfere with normal functionality, like dragging a text selection to the edge of the screen in MicroB to select more text than fits on the screen at once. So I personally vote against screen-to-bezel, though bezel-to-other-bezel would provide some more options for programs to use. Add the clockwise/counterclockwise rotation, and these gestures could then be used by any application, hildon-desktop/home included. As far as I understand, such an implementation would be perfectly compatible with MicroB and the like using their own gesture detection for the same gestures, since those are built into the app UIs by Nokia.
BUT it means that an open source recode of the MicroB UI could become more doable, because you could pull the gesture detection from the system-wide daemon once that's out, instead of writing an in-app one. Keep in mind that the N900's screen bezel isn't touch-sensitive, so the swipe-from-bezel gestures would register as swipe-from-the-very-very-edge-of-the-screen-in. But I'm rambling; back to replying to stuff: Quote:
In turn, if you incorporate some of the gestures into system-wide UI functionality, e.g. having hildon-desktop detect a swipe-up-from-bottom-bezel and return you to the task switcher (to use the WebOS/BB Playbook gesture example), it should be something you are very confident wouldn't get triggered during regular use of an app. Swipe-up-from-bezel would be one such example; swipe-down-into-bezel or a figure-8 swirl (if you have such gestures) would not be. Quote:
Something like it is doable, but it would be messy. Most of the code would probably be complicated extra 'error handling' trying to guess which touches were consecutive and which were simultaneous. This is something I wouldn't mind getting into, since I'm slightly less incompetent at C now (read: I'm extremely incompetent, but extremely < completely). The general idea shouldn't be too hard once you know how to get the screen inputs with C (I don't, though); then you just do some basic arithmetic on the coordinates of the touch after every change. For more complicated gestures like swirls, you'd need to break out some fancy maths, but I have that buried in me somewhere from my calculus-learning days.
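For the swirl case, the "fancy maths" can actually stay fairly tame: summing the z component of the cross product of consecutive movement vectors tells you the rotation sense. This is only a sketch with made-up names, not code from any daemon mentioned here; note that in screen coordinates y grows downward, so a positive sum means visually clockwise:

```c
#include <assert.h>

struct pt { int x, y; };

/* Returns +1 for a clockwise swirl (in screen coordinates, y down),
 * -1 for counterclockwise, 0 for no consistent rotation. */
int rotation_sense(const struct pt *p, int n)
{
    long cross_sum = 0;
    for (int i = 2; i < n; i++) {
        long ax = p[i - 1].x - p[i - 2].x, ay = p[i - 1].y - p[i - 2].y;
        long bx = p[i].x - p[i - 1].x,     by = p[i].y - p[i - 1].y;
        cross_sum += ax * by - ay * bx;   /* z of the cross product */
    }
    return (cross_sum > 0) - (cross_sum < 0);
}
```

A real detector would also want a magnitude threshold on cross_sum so that ordinary straight-ish drags don't register as tiny rotations.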
Re: home screen gestures
OK, I tested shortcutd again and it works.
Quote:
Quote:
I would say you always need some trigger. As the desktop has only left/right swipes, it's easy to replace them. But pannable windows, games and text editing need a lot of black/whitelisting. Swipe from bezel is ugly. If you can realize that academic stuff, you're welcome. Maybe you've proven me all wrong in your text, but please: shorter!
Re: home screen gestures
Sorry Sethka, but I can't agree with You. IMO, Mentalist did a brilliant job here explaining things, and if you're too tired to read it now, get some sleep and try again. You'd better at least consider his suggestion; he wasn't sh*t talking, he really has a point there, and ignoring him will only bring You much frustration if You decide to implement ideas without considering such useful feedback.
Some things just can't be explained (on a technical level, which I think is, or at least should be, most appreciated here) in a "shorter" way. And depreciating such effort is very rude of You. At least You can be sure that he used more brainpower/energy to write it than You need to read it thoroughly ;) Just my 2 bitcoin-cents...
Re: home screen gestures
Oh, I didn't want to sound rude, and yes, it was time for sleeping, so sorry to Mentalist Traceur.
BUT I read it about four times, and this:
- "All the background gesture detection daemon SHOULD do is process screen events and report to the rest of the system (over DBus or something else if there's more efficient channels, idk) that gesture [gesture number/name/whatever] was detected" ...is far away from everything I can do. For me, that's academic.
- "applications should set up their own 'listeners' to such events" ...does this mean the applications would have to be rewritten to work with gestures?
- "This would probably get CSSU'ed" ...never ever will something I wrote get into the CSSU, as I can't really write clean applications.
"At least, You can be sure, that he used more brainpower/energy to write it, that You can use to read thoroughly" - I wouldn't bet on that: a supernoob needs more brainpower than you to understand what you are explaining, and MT surely needed much less time for everything related to his post than I did. He seems to know a lot more than me, yes. But apart from me being rude (which, again, I didn't want to be), where do you disagree? Sure, there are more intelligent solutions, and if someone comes around to do it, I will happily step aside and shut up. "He really got a point there" ...please explain it to me. Maybe you could say he overestimated my abilities; I even had to look up what a bezel is. I don't want to sound rude now, I simply don't know what to do with MT's post.
Re: home screen gestures
Can "xdotool" be implemented so that applications on the N900 that do not support the mouse react to your commands? For example, the N900's native browser.
Re: home screen gestures
Quote:
Secondly, I can see how MT's post can be overwhelming. What you can do is start off according to your own abilities and ideas, and then changes can be implemented according to MT's post and other suggestions. Don't worry: one of the greatest advantages of an open source project is the ability to collaborate. Once you start something, you can ask other experienced developers for help, invite patches, etc. The first step is often the hardest and loneliest. Bonne chance.