Team:BU Wellesley Software/Notebook/ConsueloNotebook
Consuelo Valdes's Notebook
This is Consuelo's Notebook! Check back for more awesomeness! Follow me on Twitter: [http://twitter.com/#!/ConsiV ConsiV]
Jun 29: eLab Notebook First Computer Prototype
Working on the eLab Notebook. Having a brainstorming session with Kelsey while Michelle and Kathy are at the Wet Lab today. We sketched some designs based on their observations. We are trying to make the app conform to the Apple Human Interface Guidelines (HIG) so that it could be successfully deployed in the App Store.
Kelsey and I are currently working on two alternative navigation implementations. Kelsey is building a version that uses a segmented control to switch between functions. This is useful because we can easily call the switching function from anywhere; perfect for voice or gesture integration. My implementation stayed truer to the HIG in some ways, as it employed a root navigation controller. The issue is that we also wanted to use split views, and the two are not necessarily meant to go together.
We built the basic views for protocols, lab state/organization, the map, the calendar, and a main workspace for notes.
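To make the comparison concrete, here is a minimal sketch of the segmented-control approach in modern Swift (the actual 2011 app would have targeted the iOS 4/5 SDK, presumably in Objective-C; the class, view, and function names below are made up for illustration):

```swift
import UIKit

// Illustrative sketch only; names are invented, not the app's real classes.
class WorkspaceViewController: UIViewController {

    // One child view controller per app function (protocols, lab map, calendar, notes).
    private let functions: [UIViewController] = [
        UIViewController(), // protocols (placeholder)
        UIViewController(), // lab map (placeholder)
        UIViewController(), // calendar (placeholder)
        UIViewController()  // notes workspace (placeholder)
    ]
    private var current: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Segmented control in the navigation bar switches between functions.
        let control = UISegmentedControl(items: ["Protocols", "Map", "Calendar", "Notes"])
        control.selectedSegmentIndex = 0
        control.addTarget(self, action: #selector(segmentChanged(_:)), for: .valueChanged)
        navigationItem.titleView = control

        showFunction(at: 0)
    }

    @objc private func segmentChanged(_ sender: UISegmentedControl) {
        showFunction(at: sender.selectedSegmentIndex)
    }

    // A plain method, so it can be called from anywhere
    // (e.g. a voice or gesture handler), not just from the segmented control.
    func showFunction(at index: Int) {
        current?.willMove(toParent: nil)
        current?.view.removeFromSuperview()
        current?.removeFromParent()

        let next = functions[index]
        addChild(next)
        next.view.frame = view.bounds
        view.addSubview(next.view)
        next.didMove(toParent: self)
        current = next
    }
}
```

Because the switch is just a method call, the same code path can later be driven by voice or gesture input, which is the appeal of Kelsey's design.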
Jun 30: eLab Notebook Final Design Session
Today the eLab Notebook team got together to discuss final designs for the Lab Notebook. We incorporated some redesigns by Kathy and Michelle to support more lab functions, along with suggestions Traci made during the team brainstorming session on Monday.
We've also decided to go with the segmented control implementation, because switching functions from a navigation controller is a little more convoluted, and the combination of controls does not comply with the HIG.
Jul 5: Almost have the protocols view completely functional.
Today I focused on getting the protocols page to work in a manner that supports visual continuity, editing capabilities, and notes, and that parses the protocols from a plist or an Excel document (which is how they are currently saved).
There were several ways we considered implementing the protocols view. One was to have a check box next to each step in a protocol and show the user one step at a time, with notes and images corresponding to each step. An alternative was to use a flow chart instead of a check-box list, because some protocols have branching ("or") cases in their steps that cannot really be presented in a flat list. The trick is to create the branches dynamically and merge them back together after the branch as the user progresses from one step to the next.
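As a rough illustration of the plist side, here is a sketch in modern Swift of what a step model and loader might look like; the plist keys (`text`, `note`, `branches`) and the function name are assumptions for the example, not the actual file format:

```swift
import Foundation

// Hypothetical plist layout: an array of step dictionaries, each with the
// step text, an optional note, and optional branch steps for "or" cases.
struct ProtocolStep: Codable {
    let text: String
    let note: String?
    let branches: [ProtocolStep]?   // alternative paths that rejoin the main flow
    var isDone = false              // backs the check box next to each step

    private enum CodingKeys: String, CodingKey {
        case text, note, branches   // isDone is UI state, not stored in the plist
    }
}

// Load every step of a protocol from a bundled plist such as "Miniprep.plist".
func loadProtocol(named name: String) -> [ProtocolStep] {
    guard let url = Bundle.main.url(forResource: name, withExtension: "plist"),
          let data = try? Data(contentsOf: url),
          let steps = try? PropertyListDecoder().decode([ProtocolStep].self, from: data)
    else { return [] }
    return steps
}
```

A flow-chart presentation could walk the same `branches` array to lay out the alternative paths and then rejoin the main list after the branch.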
Jul 6: Protocols DONE!
Jul 7: Got the app to switch functions from voice commands.
After doing some research on available APIs for speech-to-text and gesture recognition on the iPhone, I found OpenEars. The API lets you bring the framework into your app and either build custom vocabularies or use the ones it provides for voice recognition.
After getting OpenEars integrated into the eLab Notebook, I tested the switching method with some phrases and got it to successfully switch! Now all that's left is to set up a dictionary.
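OpenEars (which wraps CMU PocketSphinx) reports recognized phrases through an Objective-C delegate callback, which I won't reproduce here. Purely as a sketch, the glue from a recognized phrase to the function switch could look something like this, with the command words and indices invented for the example:

```swift
// Hypothetical glue code: map a recognized phrase to a function index and
// hand it to whatever performs the switch, e.g. showFunction(at:) from the
// earlier segmented-control sketch.
let voiceCommands: [String: Int] = [
    "PROTOCOLS": 0,   // PocketSphinx vocabularies are typically all-caps words
    "MAP": 1,
    "CALENDAR": 2,
    "NOTES": 3
]

func handleHypothesis(_ hypothesis: String, switchTo: (Int) -> Void) {
    guard let index = voiceCommands[hypothesis.uppercased()] else { return }
    switchTo(index)
}
```

Setting up the dictionary is then largely a matter of making sure the recognizer's vocabulary contains exactly these command words.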
Jul 14: So much for the Dragon SDK.
After Kelsey worked on getting a dictionary integrated into the eLab Notebook and had such a hard time getting words actually understood, Orit suggested we switch to Dragon. Dragon is a well-known speech-to-text/text-to-speech API that has been used on Mac and PC platforms, and there is also an iPhone app. Unfortunately, after doing some research, it looks like a license is required to use Dragon, and it is pretty pricey ($1,000). So it does not seem like a viable option for the eLab Notebook.
Jul 21: Working on application icons!
Aug 1: First visit to BU!
Today we had our user study at BU. Everyone seemed to have a great time using the system! There were no crashes. And we got some great quotes and footage from the Wet Lab team! Go BU_WELLESLEY!!! :D