I was walking back to my apartment when, all of a sudden, I wanted some Moe’s. It had been a while since I’d had it, so I walked over to Tech Square, the nearest location. When I arrived, I discovered that it was closed. It was disappointing, but another food place was open, so I went there instead. It was interesting to observe how many people go in and out of all the food places around Tech Square; many of them are probably disappointed and in disbelief when they realize a place is closed.
A sound-directed interface would be very helpful for this activity. When people are looking around Tech Square for food, a sound played by the door once a person passes a certain point could tell them whether the place is closed. This helps both people who can see and those who cannot. For example, a person who is unable to see might walk up to a door and try opening it several times, thinking the door is simply stuck. That user, like me in my situation, would greatly benefit from a sound conveying the situation instead of wasting time and energy trying to open the door of a closed store.
Since checking whether a place is open is a very quick activity, the sound should be very quick too. A speech-based interface would be excessive and might come off as annoying if several people are trying to get into the restaurant. Because the interface is sound-based, it should still make clear whether a place is closed or open: a high-pitched, happy sound if it is open, or a low-pitched, sad sound if it is closed. There should also be a delay after one person walks by, so that a group walking by does not hear the sound over and over. The goal is to be quick and informative, but not annoying.
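The behavior described above, one tone for open, another for closed, with a delay so a group does not retrigger it, can be sketched as a small program. Everything here is a hypothetical illustration: the tone frequencies, the debounce window, and the class name are all assumptions, not part of the original design.

```python
# Hypothetical sketch of the door-sound logic: tone frequencies and
# the debounce window are illustrative assumptions.
OPEN_TONE_HZ = 880     # high-pitched, "happy" tone when the store is open
CLOSED_TONE_HZ = 220   # low-pitched, "sad" tone when it is closed
DEBOUNCE_SECONDS = 10  # suppress repeats while a group walks by together

class DoorChime:
    def __init__(self, is_open):
        self.is_open = is_open
        self.last_played = float("-inf")  # no chime has played yet

    def on_person_detected(self, now):
        """Return the tone (Hz) to play, or None during the debounce window."""
        if now - self.last_played < DEBOUNCE_SECONDS:
            return None  # someone just triggered it; stay quiet
        self.last_played = now
        return OPEN_TONE_HZ if self.is_open else CLOSED_TONE_HZ
```

For example, `DoorChime(is_open=False)` would return the low tone for the first passerby and `None` for companions arriving a few seconds later, which captures the "quick, informative, but not annoying" goal.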
When someone presses the button to cross a street near Piedmont Park while the red hand is showing, they are met with a succinct command not to cross the street. While this intersection does not have nearly as many people crossing as Tech Square, a verbal interface is a helpful addition to the crosswalk that could provide both additional safety and accessibility at the higher-traffic intersection.
First, a verbal interface on the crosswalk provides an extra safety net for people who are not paying attention while they walk. The most common behavior I observed in Tech Square was people looking down at their phones as they walked. While using a phone is not inherently bad, the distraction can cause people to miss oncoming traffic. With audio cues, people are given a friendly reminder to “wait” before trying to cross. When the light changes, another audio cue can tell them it is safe to walk, even if they are absorbed in their phones and not watching the changing light.
The second benefit of a verbal interface is that it provides an accessible way for blind people to move safely across the street. At a normal intersection, the only indications of when someone should cross are the lights and the changing sounds of traffic. The indicator lights are useless to blind people, and the sounds of traffic are simply not reliable. Verbal cues would solve this problem by providing a reliable, non-visual indication of when it is safe to cross.
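The core of this crosswalk interface is a mapping from the signal’s state to a spoken cue, which could be sketched as follows. The state names and phrasings here are illustrative assumptions, not the wording of any real pedestrian signal.

```python
# Hypothetical mapping from crosswalk signal state to a spoken cue;
# both the state names and the phrases are assumptions for illustration.
CUES = {
    "red_hand": "Wait.",
    "walk": "Walk sign is on. It is safe to cross.",
    "flashing": "Do not start crossing.",
}

def spoken_cue(signal_state):
    """Return the verbal cue for the current signal, or None if unknown."""
    return CUES.get(signal_state)
```

A real installation would announce the cue whenever the signal changes state, so the reminder reaches pedestrians regardless of whether they are looking at the light.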
At the intersection of Spring St. and 5th, many tasks are being performed, active and passive alike, all shoved into the same small area. Cars rush by while pedestrians hurry across the street and students cram for upcoming tests. Students sometimes study and read at the tables outside of Starbucks, trying to concentrate over the sounds of cars, passersby, and others sitting around them.

A device that could assist with this task would be one at each table that uses active noise control, the same technology used in noise-cancelling headphones. It emits a second sound wave designed to cancel the first, creating a more focused atmosphere for the user. This device would apply not only to someone studying or reading, but also to another task taking place at the intersection: the several meetings going on. The emitted sound does not even have to be one that merely cancels unwanted noise; it can improve the atmosphere too. For example, relaxing music could both cancel the sounds of traffic and put the user at ease, contrasting with the otherwise stressful, high-energy atmosphere of the intersection. A more cheerful conversation among friends could be accompanied by lighter, upbeat music.

With this method, the intersection can be both a place of movement and a place of stillness, usable for multiple activities without favoring one over the other. This would work best as a sound-directed interface because it could sense the tone of a conversation and create an atmosphere that matches it. This is an example of similarity attraction: if the ambience is similar in tone to the user’s conversation, it creates a better experience for the user.
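The cancellation principle behind active noise control can be shown in a few lines: the device emits a wave that is the phase-inverted copy of the noise, and the two sum to silence. This is only a minimal sketch of the idea; it models the noise as a single pure tone, which is an assumption, and real systems must measure the noise with microphones and adapt their output continuously.

```python
import math

def sample_noise(t, freq=100.0):
    """Unwanted noise, modeled here as a pure tone (a simplifying assumption)."""
    return math.sin(2 * math.pi * freq * t)

def anti_noise(t, freq=100.0):
    """The device's output: the noise wave inverted (180 degrees out of phase)."""
    return -sample_noise(t, freq)

def residual(t):
    """What the listener hears: the noise plus the cancelling wave."""
    return sample_noise(t) + anti_noise(t)
```

Because `anti_noise` is the exact negation of `sample_noise`, the residual is zero at every instant; in practice the anti-noise is only an estimate, so cancellation is partial rather than perfect.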
I believe an interface directed at creating a more focused environment for those who study or hold meetings at the intersection, one that eliminates unwanted sound and complements the task by matching the tone of the conversation, would be valuable. It would help create a small bubble of solitude amid the chaos.
Tables outside Starbucks in Tech Square