2010-12-14, 20:48 | Posts: 7,075 | Thanked: 9,073 times | Joined on Oct 2009 | @ Moon! It's not the East or the West side... it's the Dark Side | #12
2010-12-14, 20:54 | Posts: 1,042 | Thanked: 430 times | Joined on May 2010 | #13
You can do that already; I think the difference is that you want the device to actually listen in future...
2010-12-14, 20:56 | Posts: 961 | Thanked: 565 times | Joined on Jul 2007 | @ Tyneside, North East England | #14
2010-12-14, 20:57 | Posts: 434 | Thanked: 990 times | Joined on May 2010 | @ Australia | #15
2010-12-14, 21:07 | Posts: 1,042 | Thanked: 430 times | Joined on May 2010 | #16
The future is probably here already, given the interfacing capability available through the likes of LonWorks, C-Bus, Honeywell, Siemens etc.
Writing the code to use the phone's existing hardware sensors and capabilities would be the catch, and IMO (since most seem to write only for iOS) there'll be "an app for that" released in short order. A rough sketch of the sensor-reading half is below.
The arse wiping is still gonna be the hard part. Although my N900 has nice smooth rounded edges, the lack of opposable thumbs is going to cause problems with toilet paper manipulation... to fold or to scrunch, that is the question.
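Purely as an illustration of the sensor-reading half: the sysfs path below is the one commonly quoted for the N900's accelerometer, and nothing here actually talks to LonWorks/C-Bus or any home-automation bridge; that part is assumed away.

```python
# Rough sketch only: read the N900 accelerometer via sysfs and print it.
# The sysfs path is the commonly quoted one for the N900; feeding this into
# LonWorks/C-Bus/etc. is left entirely as an assumption.
ACCEL_PATH = "/sys/class/i2c-adapter/i2c-3/3-001d/coord"

def read_accelerometer(path=ACCEL_PATH):
    """Return (x, y, z) in milli-g, or None if the node isn't readable."""
    try:
        with open(path) as node:
            x, y, z = (int(v) for v in node.read().split())
        return x, y, z
    except (IOError, ValueError):
        return None

if __name__ == "__main__":
    print(read_accelerometer())
```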
2010-12-14, 21:07 | Posts: 1,210 | Thanked: 597 times | Joined on Apr 2010 | @ Hamburg, Germany | #17
2010-12-14, 21:42 | Posts: 861 | Thanked: 734 times | Joined on Jan 2008 | @ Nomadic | #18
2010-12-14, 21:51 | Posts: 1,326 | Thanked: 1,524 times | Joined on Mar 2010 | #19
Most current systems and apps could be improved with a bit of "self-awareness". We are not necessarily talking Big Brother here, just connecting and using existing data.
A classic example would be my navigation app learning the shortcuts I take every day and using them as preferred routes.
In my calendar I have entries for being on-call, which site I will be at on a work day, annual leave, etc. It cannot be too much of a leap for my phone to identify if, when and where I will be at work, then program the nav app accordingly, ready for it to leap into action when I start the car and the device detects and connects to the car stereo via Bluetooth.
A fancy bit of scripting could be enough: a D-Bus-triggered script that interrogates the calendar, firstly for days when I am at work (i.e. not weekends or leave), then for the location where I am supposed to be, and then launches the satnav app with that as the destination. Something along the lines of the sketch below.
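A minimal sketch of that idea, with two assumptions flagged up front: the calendar is read from a plain .ics export rather than queried live over D-Bus, and "navapp --destination" is only a placeholder for however the real satnav accepts a destination.

```python
# Sketch only. Assumptions: the device calendar has been exported to an .ics
# file, and "navapp --destination <place>" stands in for the real satnav
# launch command. Neither is a documented Maemo interface.
import datetime
import re
import subprocess

ICS_FILE = "/home/user/MyDocs/work.ics"    # assumed calendar export
NAV_COMMAND = ["navapp", "--destination"]  # placeholder command

def todays_work_location(ics_path):
    """Return the LOCATION of today's entry, or None on weekends/no entry."""
    today = datetime.date.today()
    if today.weekday() >= 5:               # skip Saturday and Sunday
        return None
    stamp = today.strftime("%Y%m%d")
    event = {}
    for line in open(ics_path):
        line = line.strip()
        if line == "BEGIN:VEVENT":
            event = {}
        elif line.startswith("DTSTART"):
            match = re.search(r"(\d{8})", line)
            if match:
                event["date"] = match.group(1)
        elif line.startswith("LOCATION:"):
            event["location"] = line[len("LOCATION:"):]
        elif line == "END:VEVENT":
            if event.get("date") == stamp and event.get("location"):
                return event["location"]
    return None

if __name__ == "__main__":
    location = todays_work_location(ICS_FILE)
    if location:
        subprocess.call(NAV_COMMAND + [location])  # hand off to the satnav
```

Triggering it when the car stereo connects would be a separate step, hung off whatever signal BlueZ emits on connection; that part is not shown here.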
Another lovely little app is the beta app on Symbian which identifies your most-used shortcuts and places them on the home screen.
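That shortcut-learning idea boils down to frequency counting; a toy version, with a fabricated launch log standing in for real launch events:

```python
# Toy sketch of "most used shortcuts": rank apps by how often they were
# launched. launch_log is fabricated example data; a real version would
# record genuine application-launch events.
from collections import Counter

launch_log = ["browser", "maps", "xterm", "browser", "camera",
              "browser", "maps", "xterm", "browser"]

top_shortcuts = [name for name, _ in Counter(launch_log).most_common(4)]
print(top_shortcuts)   # e.g. ['browser', 'maps', 'xterm', 'camera']
```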
2010-12-15, 20:16 | Posts: 961 | Thanked: 565 times | Joined on Jul 2007 | @ Tyneside, North East England | #20
The SVP/video that the OP was referring to, and the very open talk around BetaLabs about Bots, Feel, and Situations, pretty much point to this idea of *some* device intelligence being added to the default UI. It really is a matter of time, and they've been getting some good feedback (it seems) from those three apps.
Personally speaking, Bots is one core reason I can't see myself moving to another mobile platform just yet. Yes, there are mobile platforms where you can download a third-party application and then program it to do this or that given certain parameters. It is another thing for the device to "learn" and "adapt" without demanding much of the user's attention. If you will, there's no need to train it, because the platform is built to learn from the outset (roughly the contrast sketched below).
This is what is meant by devices being smart, and I welcome it - as long as we can see it implemented in small steps (at least at first).
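Not the actual Bots/Situations code, obviously, but the "program it" versus "it learns" contrast reads roughly like this in sketch form (profile names, hours and thresholds are all invented for illustration):

```python
# Illustration of "program it" vs "it learns"; nothing here is a real
# Bots/Feel/Situations API, just invented names to show the contrast.

# Programmed: the user writes the rule up front and the device obeys it.
def programmed_profile(hour):
    return "silent" if hour >= 22 or hour < 7 else "general"

# Learning: the device watches what the user actually does and starts
# suggesting it once a pattern shows up, with no setup step.
class LearningProfile:
    def __init__(self, threshold=3):
        self.silent_picks = {}      # hour -> times the user chose silent
        self.threshold = threshold

    def observe(self, hour, chosen):
        if chosen == "silent":
            self.silent_picks[hour] = self.silent_picks.get(hour, 0) + 1

    def suggest(self, hour):
        return "silent" if self.silent_picks.get(hour, 0) >= self.threshold else "general"
```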
Tags: could tmo, get any worse, hot air baloon, skynet