Human-Computer Interactions


This topic contains 2 replies, has 2 voices, and was last updated by Chris Cairns 9 years ago.

  • #165312

    Henry Brown
    Participant

    From Technology Review:

    Questions for MIT Media Lab associate professor Pattie Maes about the future of human-computer interaction in light of recent advances in mobile technologies:

    What will smart phones be like five years from now?

    Phones may know not just where you are but that you are in a conversation, and who you are talking to, and they may make certain information and documents available based on what conversation you’re having. Or they may silence themselves, knowing that you’re in an interview.

    They may get some information from sensors and some from databases about your calendar, your habits, your preferences, and which people are important to you.

    Once the phone is more aware of the user’s current situation, and the user’s context and preferences and all that, then it can do a lot more. It can change the way it operates based on the current context.

    Ultimately, we may even have phones that constantly listen in on our conversations and are just always ready with information and data that might be relevant to whatever conversation we’re having.

    How will mobile interfaces be different?

    Speech is just one element. There may be other things—like phones talking to one another. So if you and I were to meet in person, our phones would be aware of that and then could make all the documents available that might be relevant to our conversation, like all the e-mails we exchanged before we engaged in the meeting.

    Just like if you go to Google and do a search, all the ads are highly relevant to the search you’re doing, I can imagine a situation where the phone always has a lot of recommendations and things that may be useful to the user given what the user is trying to do.

    Another idea is expanding the interaction that the user has with the phone to more than just touch and speech. Maybe you can use gestures to interact. Sixth Sense, which we built, can recognize gestures; it can recognize if something is in front of you and then potentially overlay information, or interfaces, on top of the things in front of you.
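    A rough way to picture the context-aware behavior described above (sensor and calendar inputs driving changes in how the phone operates) is as a small rule engine. The Python sketch below is purely hypothetical: the Context fields, the decide_phone_behavior function, and the stubbed document lookup are invented for illustration and do not come from the interview or any real phone API.

        from dataclasses import dataclass, field
        from typing import Dict, List, Optional


        @dataclass
        class Context:
            """A snapshot of what the phone currently knows (all fields hypothetical)."""
            location: str                              # e.g. derived from GPS or Wi-Fi
            calendar_event: Optional[str] = None       # e.g. the current entry in the user's calendar
            nearby_contacts: List[str] = field(default_factory=list)  # e.g. found via phone-to-phone discovery


        def decide_phone_behavior(ctx: Context) -> Dict[str, object]:
            """Choose a ringer mode and documents to surface based on the current context."""
            ringer = "normal"
            suggested_documents: List[str] = []

            # Silence the phone if the calendar says the user is in an interview or meeting.
            if ctx.calendar_event and any(
                word in ctx.calendar_event.lower() for word in ("interview", "meeting")
            ):
                ringer = "silent"

            # If a known contact's phone is nearby, surface material relevant to that person,
            # e.g. e-mails exchanged before the meeting (the lookup is just a stub here).
            for contact in ctx.nearby_contacts:
                suggested_documents.append(f"e-mails exchanged with {contact}")

            return {"ringer": ringer, "suggested_documents": suggested_documents}


        if __name__ == "__main__":
            ctx = Context(
                location="office",
                calendar_event="Interview with Technology Review",
                nearby_contacts=["Pattie Maes"],
            )
            print(decide_phone_behavior(ctx))

    In practice the hard part would be inferring the context itself (who you are talking to, which event you are actually in) from sensors and databases; the rule evaluation on top of that inference is the easy piece.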

  • #165316

    Chris Cairns
    Participant

    I’ll buy that phone! There will be a lot of amazing technology coming out once the ability to respond to contextual information matures.

  • #165314

    Corey McCarren
    Participant

    Phones will hopefully be largely made out of this stuff — graphene — five years from now.
