New Technology Makes Digital Physical

With all the technology around us, it is hard to imagine what the future will hold and how it will affect our lives. For this reason, I believe it is important to look at the latest technology being developed in university settings and ask whether it can improve the lives of people with Learning Differences. In other words, who is doing some radically different thinking about our world?

One such person is Dr. Hiroshi Ishii, Head of the Tangible Media Group at the Massachusetts Institute of Technology (MIT) and Co-Director of the Things That Think (TTT) Consortium. Those titles give you some idea of how close Dr. Ishii works to the leading edge of technology and its impact on our lives. In an interview with Olivia Solon of wired.co.uk, Dr. Ishii defines his interest very clearly: “My lifespan is running out so I need to focus on the most, crazy, edgy stuff…” The technical description of his work can overwhelm the average person, so here’s my definition: Dr. Ishii is working to make the interfaces of our world interact with us on a more physical level.

Since the advent of microelectronics and integrated circuits, developers have slowly changed how we interact with the things we use. The television (TV) is a good example. We used to have to get up out of our chair to change the channel with a dial (What? You didn’t experience that?). Next, we had a panel on the front of the TV that we would use to change the channel. Now, all of us use some kind of remote control. There may still be buttons, but the experience is becoming less and less tactile. The latest way to control your TV is by voice. We have less and less direct physical contact with the world around us. For those of us with limited physical movement, this might be just what we want. For those of us with speech, hearing, or vision problems, this lack of physical contact may make interacting with devices more difficult.

Dr. Ishii’s Tangible Media Group has developed a new kind of display and user interface called inFORM. The group defines it as

“a Dynamic Shape Display that can render 3D content physically, so users can interact with digital information in a tangible way.”

Don’t expect me to put that into layman’s terms. The best thing to do is to watch the video. Note that at the very beginning a person remotely moves objects just by moving his hands, and at about 2:44 you can see math equations rendered as shapes you can touch and feel.

So what?! There are many people with Learning Differences who have difficulty interacting with people in a busy, noisy environment. If these people could work from home and yet physically interact with the environment where their coworkers are working, they would be able to have a more meaningful position in the workplace.

And the advantages of seeing and feeling math? My bachelor’s degree was in math with a minor in physics. I spent hours doing three-dimensional math, and yet I could not picture in my head what any of those equations represented. I just couldn’t see it. It would have made so much more sense to me if I could have seen what an equation was representing in the physical world. I would have been looking not at an abstract theory but at an object in the real world, and that would suddenly have made math real and interesting. I know that I’m not the only person out there who could benefit from this type of display.
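To hint at what that might feel like today, here is a minimal sketch, assuming Python with NumPy and matplotlib (neither of which is part of the inFORM project or mentioned above). It draws a simple three-dimensional equation as a surface you can rotate on screen, a flat cousin of what a shape display could let you actually touch.

    # A rough sketch (assumed tools, not from the article): draw the equation
    # z = sin(x) * cos(y) as a 3D surface so it becomes a visible shape.
    import numpy as np
    import matplotlib.pyplot as plt

    # Build a grid of (x, y) points and evaluate the equation at each one.
    x = np.linspace(-3.0, 3.0, 100)
    y = np.linspace(-3.0, 3.0, 100)
    X, Y = np.meshgrid(x, y)
    Z = np.sin(X) * np.cos(Y)

    # Plot the values as a rotatable 3D surface.
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot_surface(X, Y, Z, cmap="viridis")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("z = sin(x)cos(y)")
    plt.show()

It is only a picture on a screen, of course; the promise of a display like inFORM is that the same surface could rise up under your fingers.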

So, to Dr. Ishii and his students at MIT, I say “Congratulations!” Keep working on finding new ways for us to physically interact with our world. I hope that some enterprising entrepreneur will figure out a way to turn this lab creation into something we can use in our everyday lives.