Hungry Again?

I was walking back to my apartment when, all of a sudden, I wanted some Moe’s. It had been a while since I had eaten there, so I walked over to Tech Square, the nearest location. When I arrived, I found that it was closed. That was disappointing, but another restaurant was open, so I went there instead. It was interesting to observe how many people go in and out of the food places around Tech Square; many of them are probably disappointed, even in disbelief, when they realize a place is closed.

A sound-directed interface would be very helpful for this activity. When people are looking around Tech Square for food, a sound played by the door once a person passes a certain point could tell them whether the place is closed. This helps both people who can see and those who cannot. For example, a person who is unable to see might walk up to a door and try opening it several times, thinking the door simply will not open. That user, like me in my situation, would benefit greatly from a sound announcing that the store is closed instead of wasting time and energy trying to open its door.

Since checking whether a place is open is a very quick activity, the sound should be equally quick. A speech-based interface would be excessive and might become annoying if several people are trying to get into the restaurant. Because the interface is purely sound-based, it should make clear whether the place is open or closed: a high-pitched, happy tone if it is open, and a low-pitched, sad tone if it is closed. There should also be a cooldown after one person walks by, so that a passing group does not trigger the sound repeatedly. The goal is to be quick and informative, but not annoying.
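To make that behavior concrete, here is a minimal sketch in Python, assuming a hypothetical motion-sensor callback (on_person_detected) and a placeholder play_tone function, since no actual hardware is specified. It plays a short high tone when the store is open, a short low tone when it is closed, and stays quiet during a cooldown window so a passing group only hears the cue once.

```python
import time

# Assumed values: a short high-pitched chime for "open", a low tone for "closed".
OPEN_TONE_HZ = 1200
CLOSED_TONE_HZ = 300
TONE_SECONDS = 0.3
COOLDOWN_SECONDS = 10  # one cue per group, not one per person

last_played = 0.0

def play_tone(frequency_hz: float, duration_s: float) -> None:
    """Placeholder for whatever speaker or buzzer API the installation uses."""
    print(f"beep: {frequency_hz} Hz for {duration_s} s")

def on_person_detected(store_is_open: bool) -> None:
    """Hypothetical callback fired when someone passes the trigger point near the door."""
    global last_played
    now = time.monotonic()
    if now - last_played < COOLDOWN_SECONDS:
        return  # a group is walking by; stay quiet to avoid annoying them
    last_played = now
    play_tone(OPEN_TONE_HZ if store_is_open else CLOSED_TONE_HZ, TONE_SECONDS)
```

The ten-second cooldown is only a guess; in practice it would be tuned to how quickly groups move past the door.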

The Rush Hour Tango

Intersections are not merely crossing points between streets; they are hubs of activity and nodes within communities where interactions and routines take place. Ideally, drivers and pedestrians would cross paths in an organized and timely fashion, like an intricate dance. However, impatience and human error often lead to one party stepping out of turn. Drivers are legally allowed to turn right on red, and pedestrians sprint across the street at any chance they get, regardless of the walk signal. Busy intersections like 5th and Spring St need an interface that increases safety and awareness for people both on the road and on the sidewalk.

According to the Federal Highway Administration, approximately 65,000 pedestrians are injured by moving vehicles in a given year. In 2014, almost 5,000 of those injuries resulted in deaths. From a pedestrian’s perspective, crossing should be straightforward if you follow the visual walk signals. Additionally, the sounds of loud trucks, buses, and cars speeding down the street act as an indicator of when to cross and when not to. For someone who is visually impaired or easily distracted, on the other hand, a direct and alert auditory signal might be a beneficial instruction. Many crosswalks now include buttons that speak to users when pushed, emitting a forceful but poor-quality “wait!” or “crossing!” I would improve this interface by making the voice clearer and more neutral; the signal should be neither alarming nor calming, because its purpose is to relay information, not an emotion.

From the driver’s perspective, sitting at a traffic light is certainly not an enjoyable experience. Although many states have laws protecting pedestrians in crosswalks, drivers attempt to make quick turns or speed across intersections without paying attention to the people around them. Distractions from phones, radios, and activity outside the vehicle all take away from the task at hand. I would improve driver awareness by adding a signal to the intersection interface that is sent to vehicle radios, or possibly cellphones, to alert a driver when pedestrians are crossing. Such a voice would need to be neutral, as Clifford Nass and Scott Brave state that “the same voice cannot be effective for all drivers.” It would be difficult for an outside interface to detect a driver’s emotional state from within the vehicle, so neutrality is key to appealing to the widest range of drivers.
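There is no standard channel today for pushing such an alert onto a car radio or phone, so the following Python sketch covers only the intersection-side logic, with a hypothetical broadcast_to_vehicles placeholder standing in for whatever delivery mechanism would be used; the message itself stays neutral and purely informational.

```python
from typing import Optional

def crossing_message(pedestrians_in_crosswalk: int) -> Optional[str]:
    """Build a neutral, information-only alert for nearby drivers, or None if the crosswalk is empty."""
    if pedestrians_in_crosswalk <= 0:
        return None
    # Neutral phrasing: relay information, not emotion.
    return f"Pedestrians in the crosswalk ahead: {pedestrians_in_crosswalk}."

def broadcast_to_vehicles(message: str) -> None:
    """Placeholder for however the alert would actually reach radios or phones."""
    print(message)

def on_sensor_update(pedestrians_in_crosswalk: int) -> None:
    """Hypothetical hook called whenever the crosswalk's sensors update their count."""
    message = crossing_message(pedestrians_in_crosswalk)
    if message is not None:
        broadcast_to_vehicles(message)
```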

A Guiding Voice

“Wait!”

When someone presses the button to cross a street near Piedmont Park while the red hand is showing, they are met with a succinct command not to cross. While that intersection does not have nearly as many people crossing as Tech Square, a similar verbal interface would be a helpful addition to the crosswalk, bringing both additional safety and accessibility to the higher-traffic intersection.

Firstly, a verbal interface at the crosswalk provides an extra safety net for people who are not paying attention while they walk. The most common behavior I observed in Tech Square was people looking down at their phones as they walked. While using a phone is not inherently bad, the distraction can keep people from noticing traffic coming down the street. With audio cues, people get a friendly reminder to “wait” before trying to cross, and when the light changes, another cue can tell them it is safe to walk even if they are absorbed in their phone and miss the changing light.

The second benefit of a verbal interface is that it provides an accessible way for blind people to move safely across the street. At a normal intersection, the only indications of when to cross are the lights and the changing sounds of traffic. The indicator lights are useless to blind people, and the sounds of traffic are simply not reliable. Verbal cues would solve this problem by providing a reliable, non-visual indication of when it is safe to cross.
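As a small illustration of how such cues could be wired to the signal, here is a Python sketch that assumes a placeholder speak function for the crosswalk’s text-to-speech output and a walk-signal state supplied by the traffic controller; both names are invented for the example.

```python
from enum import Enum

class WalkSignal(Enum):
    WALK = "walk"
    DONT_WALK = "dont_walk"

def speak(text: str) -> None:
    """Placeholder for the crosswalk's text-to-speech output."""
    print(text)

def announce(signal: WalkSignal) -> None:
    """Speak a short, neutral cue each time the walk signal changes state."""
    if signal is WalkSignal.WALK:
        speak("Walk sign is on.")
    else:
        speak("Wait.")

# Example: the controller calls announce() on every state change.
announce(WalkSignal.DONT_WALK)  # -> "Wait."
announce(WalkSignal.WALK)       # -> "Walk sign is on."
```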
