
Voca lets people on the go send text messages by voice, eliminating the need to type.

2-Month MVP | Work Project | Live on the Play Store

Project Overview

The Problem

Texting by hand is often tedious and mistake-prone, especially when you are in a hurry. Why can't texting be as easy as talking? We wanted to create a product that made composing and editing messages by voice intuitive.

Project Goal

The initial phase was to develop a minimum viable product that accurately sent messages by voice; a second phase refined the product based on feedback from initial users. We also introduced features for a new kind of user: the driver.

My Role

I was the Product Designer throughout the project (from idea conception to the present), working directly with the Product Manager. 

The User

Our initial users came from the Robin voice assistant app. We then narrowed the target user to drivers who use voice apps while driving.

The Design Process

The flow comprised three main steps: 1) selecting a contact, 2) composing the message, and 3) sending the message. We therefore focused first on developing the contact screen and the dictation screen.

I started by creating some initial sketches of these two screens, then chose the "most promising" sketches to build into simple sketched prototypes, to test and further explore the interaction between the two screens.

Contact Screen

The contact screen was the simpler of the two, so we started there. Even so, it proved more complex than anticipated. For instance, how would the avatar look with and without a picture? How would new messages appear on the contact screen? What information does the user need on this screen?

Initial contact screen sketches

Finished contact screen, with the newest messages coming in on top. (Right) We experimented with having new messages come in at the bottom; I initially thought the new messages would be easier to reach with one hand, but it confused users, so we reverted to the standard order of most recent on top.

Finished contact screen. 


Dictation Screen

Developing the dictation screen took much more work than the contact screen, as there was a lot to consider. For example, how does the user turn the microphone on and off? How does the user know when the microphone is on or off? How does the user intuitively edit their message? And finally, how does the user send the message?

In an early draft of the dictation screen, users did not understand that the animation behind the texting area meant the microphone was on and listening. There was another problem as well: what if the user was in a quiet spot and could not dictate their message aloud? Or what if they were around people they did not want hearing their messages?

A few early animation sketches

Early sketches of the dictation screen

Early dictation screen draft


To solve both problems, we connected the animation more directly to the microphone, with elements showing in front of and behind the texting screen. We also added the ability to type your message and changed the prompt from "say message" to "say message or tap here". Tapping the text area brings up the keyboard and turns off the microphone.
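The revised tap-to-type interaction can be summarized as a small piece of state logic. This is an illustrative sketch only (the class and method names are hypothetical; the shipped app is an Android implementation):

```python
class DictationInput:
    """Hypothetical model of the dictation screen's input modes."""

    def __init__(self):
        self.mic_on = True           # dictation is the default mode
        self.keyboard_visible = False

    def prompt(self):
        # The revised prompt invites either action.
        return "say message or tap here"

    def tap_text_area(self):
        # Tapping brings up the keyboard and mutes the mic, so users in
        # quiet or public places can type instead of dictating.
        self.keyboard_visible = True
        self.mic_on = False
```

The key design point this captures is that switching to typing is a single tap, with the microphone turned off implicitly rather than through a separate control.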

As we continued to iterate on the design, we next realized that we needed a way to see the message thread. This affected the dictation screen because we did not want the user to take an extra step to view their thread (i.e., go from a) the contact screen, to b) the message thread, to c) the dictation screen), so we decided to integrate the message thread into the dictation screen.


Previous Messages

My initial design had the message thread in a drawer to the right, so that it was easily available by swiping left from the dictation screen, but this design was not direct enough. The last message needed to be right there on the page without requiring the user to do an extra swipe to view the message thread.

Early sketch exploring how to integrate previous messages into the dictation screen

Message thread available by swiping left


To give the message thread greater visibility, we moved it to the top of the screen, where the last few words of the conversation are always in sight; the full conversation is available by swiping down.


As we continued to develop the app, we realized that one of the key pieces of functionality was making editing texts by voice more intuitive. Intuitive commands included "Keep the last part" and "Delete the second part". Since people do not naturally "spit out" an entire text in one breath, we had to find a way to visually distinguish each separate utterance for the editing process to work properly. To show the user how their phrases were understood, I selected a number of colors that would highlight the text without making it unreadable.
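One way to sketch the per-utterance highlighting and a part-based edit command is below. The palette values and function names are hypothetical; the shipped app's colors and command handling differed in detail:

```python
from itertools import cycle

# Hypothetical palette; the real colors were chosen to highlight text
# without hurting readability.
PALETTE = ["#FFE0B2", "#B3E5FC", "#C8E6C9", "#F8BBD0"]

def color_utterances(utterances):
    """Pair each dictated utterance with a highlight color, cycling the
    palette so adjacent utterances are visually distinct."""
    return list(zip(utterances, cycle(PALETTE)))

def delete_part(utterances, index):
    """Apply a command like "Delete the second part" (index 1) by
    removing that utterance from the draft."""
    return [u for i, u in enumerate(utterances) if i != index]
```

For example, a message dictated as two utterances, ["Running late", "be there in ten"], would be shown in two distinct highlight colors, and "Delete the second part" would leave only "Running late".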


Users have responded really well to the app: it averages 4 stars on the Play Store despite being only an MVP.

“This app is great! I love that if you don't want to speak your texts at the moment you don't have to jump through firey rings with glass on the other side to switch to keyboard I made this my default messaging app mainly because I drive a truck and its safer to speak texts this app does what I need with Big buttons so I don't have to squint to send a message”  — Tom Steketee, Playstore Review