Human-Computer Interaction & User Interface Technology
Nothing's the same anymore
—Commander Jeffrey Sinclair, Babylon 5, "Chrysalis"
Of all the work I did in graduate school, this was the most important to me.
It really began when I took a Graphic Design class. A student announced that one of her design projects would be a mobile app to help transgender people find gender-safe bathrooms. The professor, along with the rest of the class, began to nod, and the mumbled consensus that followed was that this idea was a good one!
The acceptance just blew me away!
The idea that a cisgender person would propose such a project, and that it would be accepted by other cisgender people, was outside what I believed possible in the world. But it happened. In all honesty, that was the single most important thing I learned during my graduate school tenure.
So, I proposed a trans-focused project to my Human-Computer Interaction & User Interface Technology class. From there, another student and I ended up researching how advances in Human-Computer Interaction might help the transgender community express the gender they identify with. What follows is a description of the applications we developed: I led development of the Voice Training and Gait desktop applications, while my partner worked on the Achievement Tracker.
Transgender people have used audio wave analysis tools to try to find a voice that will encourage cisgender people to gender them appropriately. However, the output of these applications tended to confuse anyone inexperienced with them. So I developed an application whose output is shown above.
The bottom bar represents the common male range, the androgynous range, and the common female range in blue, purple, and pink respectively. The two screenshots each capture 10 minutes of speech from YouTube videos: in the first, the speaker was male; in the second, female.
The area above the bottom bar is a heat map of the frequencies the user hit while speaking. The more often a particular Hz reading was found, the hotter that part of the heat map got. Interestingly, the female voice reads a little low; no other pictures are available, but this finding repeated across all samples. What was expected, and also repeatedly found, was that women speak over a greater range of frequencies than men: the female samples never produced the concentrated hot spot the male samples did.
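Under the hood, a heat map like this is essentially a histogram of the dominant frequency found in each short FFT frame of speech. Here is a minimal sketch in Python with NumPy; the function name, frame sizes, and the 50–400 Hz voice band are illustrative choices, not the original application's values:

```python
import numpy as np

def pitch_histogram(samples, rate, frame=2048, hop=512,
                    lo=50.0, hi=400.0, bins=64):
    """Histogram of the dominant frequency per FFT frame --
    the raw data behind a voice heat map (illustrative sketch)."""
    window = np.hanning(frame)
    edges = np.linspace(lo, hi, bins + 1)
    counts = np.zeros(bins, dtype=int)
    for start in range(0, len(samples) - frame, hop):
        chunk = samples[start:start + frame] * window
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(frame, d=1.0 / rate)
        # Restrict the peak search to the human-voice band.
        band = (freqs >= lo) & (freqs <= hi)
        peak = freqs[band][np.argmax(spectrum[band])]
        counts[np.searchsorted(edges, peak) - 1] += 1
    return edges, counts

# A 220 Hz test tone should pile every frame into one hot bin.
rate = 16000
t = np.arange(rate * 2) / rate             # two seconds of audio
edges, counts = pitch_histogram(np.sin(2 * np.pi * 220 * t), rate)
hot = np.argmax(counts)
print(edges[hot], edges[hot + 1])          # hottest bin, near 220 Hz
```

Real speech would spread across many bins; rendering `counts` as color intensity per bin gives the heat map.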
Thus, the idea was to have trans men and trans women try to generate heat maps that looked like those of their target gender. Beyond the bar, there was no indication of maleness or femaleness by the system. This was an intentional feature of all the applications. The goal was to help trans people feel more confident in their ability to express the gender they identify with. The last thing we wanted was to enable the use case of a trans person having the system tell them they've achieved something like a 69% target-gender rating and then being unable to improve it further. While computed gendering may be desirable to some, the risk of doing more harm than good struck us as too high.
To further assist users in achieving the voice of their target gender, MIDI tones could be generated from the keyboard; each tone displayed a bar on screen so the user could see where that frequency sat. This not only helped calibrate the software and build confidence that the Fast Fourier Transform was being applied correctly, it also helped users bring their voice to a pitch in their target gender's range.
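That calibration check can be reproduced in a few lines: synthesize a tone at a MIDI note's equal-temperament pitch and confirm the FFT peak lands where expected. This is a sketch under those assumptions, not the original desktop application's code:

```python
import numpy as np

def midi_to_hz(note):
    """Equal-temperament pitch of a MIDI note number (A4 = 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def detected_peak(tone, rate):
    """Dominant frequency of a signal via FFT -- the kind of check
    that confirms the analysis chain is calibrated."""
    spectrum = np.abs(np.fft.rfft(tone))
    freqs = np.fft.rfftfreq(len(tone), d=1.0 / rate)
    return freqs[np.argmax(spectrum)]

rate = 44100
note = 57                                   # MIDI A3 = 220 Hz
f = midi_to_hz(note)
t = np.arange(rate) / rate                  # one second of samples
tone = np.sin(2 * np.pi * f * t)
print(int(detected_peak(tone, rate)))       # → 220
```

If the detected peak doesn't match the generated note, something in the capture or FFT pipeline is miscalibrated.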
The Gait project took the greatest amount of iteration to execute.
To start, we used BioMotionLab's BMLWalker algorithm to create a template walker for the user to compare themselves against. Then, a series of voice commands was implemented to change the walker and alter other parameters. Finally, we added the ability to record a walking session, along with the ability to isolate a sub-clip from that session for later analysis by the user.
The biggest problem was capturing a walk cycle long enough to actually analyze. A single Kinect could work, but not all that well. So I had the idea to use Kinects in series, handing the user off between them the way cell phone towers hand off an active call.
[Figures: Kinect 0 tracking → Kinect hand-off → Kinect 1 tracking]
This setup did suffer from beam interference between the Kinects. However, that was handled easily enough!
It did work, albeit, honestly, on the rough side.
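The handoff itself boils down to a cell-tower-style decision: stay with the currently active Kinect until its tracking degrades and another sensor is clearly doing better. The following sketch illustrates that idea; the function name, confidence scores, and thresholds are all invented for the example, not the project's actual values:

```python
def choose_active(confidences, active, threshold=0.5, margin=0.15):
    """Pick which Kinect drives the skeleton feed.

    Stay on the active sensor unless its tracking confidence has
    dropped below `threshold` AND some other sensor beats it by
    `margin` -- the hysteresis keeps the feed from flapping between
    sensors at the seam of their fields of view.
    """
    best = max(range(len(confidences)), key=lambda i: confidences[i])
    if (confidences[active] < threshold
            and confidences[best] > confidences[active] + margin):
        return best
    return active

print(choose_active([0.9, 0.3], active=0))  # → 0 (no reason to switch)
print(choose_active([0.2, 0.8], active=0))  # → 1 (hand off to Kinect 1)
```

The margin is what makes it behave like a cell handoff rather than a naive "always take the max," which would jitter whenever two sensors track equally well.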
Like my Spatial User Interfaces project, this project required me to draw on practical experience. Due to how the Kinect SDK worked at the time (and perhaps still does), each Kinect needed its own process so both could observe simultaneously. Those processes had to communicate skeleton information back to the display application to feed it the user data it needed. This was accomplished with Memory-Mapped Files and Windows Events.
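As a rough illustration of the shared-memory half of that design (the Windows Event signaling is omitted here), this Python sketch packs a skeleton frame into a memory-mapped file the way a sensor process might, then unpacks it the way the display would. The frame layout, joint count, and values are invented for the example:

```python
import mmap
import os
import struct
import tempfile

JOINTS = 20                                  # Kinect v1 skeleton joint count
FRAME_FMT = "<I" + "fff" * JOINTS            # frame id + (x, y, z) per joint
FRAME_SIZE = struct.calcsize(FRAME_FMT)

# Back the shared region with a zero-filled temp file.
fd, path = tempfile.mkstemp()
os.write(fd, b"\0" * FRAME_SIZE)
os.close(fd)

with open(path, "r+b") as f, mmap.mmap(f.fileno(), FRAME_SIZE) as shared:
    # Writer side: a Kinect process publishes its latest frame.
    joints = [(j * 0.1, j * 0.2, j * 0.3) for j in range(JOINTS)]
    flat = [v for xyz in joints for v in xyz]
    shared[:] = struct.pack(FRAME_FMT, 42, *flat)

    # Reader side: the display application unpacks the latest frame.
    frame_id, *coords = struct.unpack(FRAME_FMT, shared[:FRAME_SIZE])
    print(frame_id, len(coords) // 3, "joints")

os.remove(path)
```

In the real setup, a Windows Event would tell the display a fresh frame had been written, so it never had to poll the mapped region.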
The final application was an HTML mobile application for tracking transition achievements. Its development grew from the assumption that a very important goal for trans people is being able to live as a member of their target gender. Using this application, they'd be able to track their accomplishments and, in conjunction with their therapist, set new goals for themselves between sessions.
This aspect of the research continues to live on. On the side, I'm currently developing an application that will let trans people generate little infographics they can put at the bottom of forum posts to announce their transition progress.
We didn't get IRB approval in time to test our application suite on human subjects: both who we would test and what exactly we would test needed to be spelled out more clearly before permission could be granted. It's unfortunate, but it's also why some of this work continues outside the restrictions of academia.