This past week was spent getting further acquainted with the subject matter of my project. I decided to read another piece of background research: the paper, titled “Distribution of Semantic Features Across Speech & Gesture by Humans and Machines,” is an analysis by MIT researchers Justine Cassell and Scott Prevost on how computers interact with people. The paper was a great read, and I’ll undoubtedly be incorporating elements of Cassell and Prevost’s research into my own; however, I found it less directly relevant than I had hoped. A heavy emphasis was placed on the importance of gesture in human communication, an idea I found extremely interesting but less applicable to my own research. Cassell and Prevost stressed physical gesture so strongly that, in addition to writing an algorithm that made use of semantics and world analysis, they gave the algorithm a rendered 3D model to physically reinforce the meaning of metaphors and other vague expressions. They offer an expression such as “that folder,” which conveys meaning but lacks the specificity of a phrase such as “the folder on top of the stack to the left of my computer.” The information that is not explicitly stated would otherwise be transmitted non-verbally with a gesture such as a point.
I don’t intend to build a 3D model into my project, because I don’t deem one necessary for my methods; the methods, after all, are concerned only with metaphors. This is why Cassell and Prevost’s paper is slightly less relevant than I had hoped, but it did give me valuable insight into how CDSMs function and how one can effectively analyze human interaction on a computer.
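To make the CDSM idea concrete, here is a minimal Python sketch of the core mechanism: each word gets a vector of distributional features, and a phrase’s meaning is composed from its words’ vectors (additive composition here, one of the simplest schemes). The words, feature dimensions, and vector values below are entirely made up for illustration, not drawn from Cassell and Prevost’s work or any real corpus.

```python
import math

# Hypothetical feature dimensions: (animal-ness, tech-ness, motion-ness).
# Real CDSMs derive these vectors from word co-occurrence counts in a corpus.
word_vectors = {
    "computer": [0.1, 0.9, 0.1],
    "mouse":    [0.6, 0.5, 0.3],  # ambiguous: animal or device
    "cat":      [0.9, 0.0, 0.4],
}

def compose(words):
    """Additive composition: a phrase vector is the sum of its word vectors."""
    dims = len(next(iter(word_vectors.values())))
    phrase = [0.0] * dims
    for w in words:
        for i, v in enumerate(word_vectors[w]):
            phrase[i] += v
    return phrase

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

# Composing "computer mouse" pulls the ambiguous "mouse" toward the
# tech sense: the phrase ends up closer to "computer" than to "cat".
phrase = compose(["computer", "mouse"])
print(cosine(phrase, word_vectors["computer"]))
print(cosine(phrase, word_vectors["cat"]))
```

Additive composition ignores word order, which is one reason fancier CDSMs exist, but even this toy version shows how composed vectors can disambiguate a metaphorical or ambiguous word from its context.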
For these first couple of weeks, in addition to my main research, I’ve also been refreshing myself on Python, the language I’ll be using to code my project. It’s been a while since I last used Python, and it feels good to get back into the swing of things!