Since its release, a surprising number of users have found the app and have been using it. Users have been requesting features and providing feedback on the Play Store and Chrome Web Store.
Some teachers have even tweeted about the app!
This summer Veronica will be going over iLanguageCloud user reviews in order to document what needs to be done in the next releases. She first found that most of the reviews point to distinct user groups with different goals when they open iLanguageCloud: some users want to paste a full text and see a cloud, but most users want to see all the words they paste.
She started by identifying the user types with a CouchDB map/reduce and by learning how to do statistical analysis in LibreOffice. Once she had identified statistics that categorize the user types, she added tests for these user types to the codebase using Jasmine.
Users are often creating tag clouds, not full-text clouds. We attribute this to users being accustomed to pre-filtering their text down to only the words they want to show, with random text sizes, rather than sizing text by frequency or other factors.
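The frequency-based sizing mentioned above can be sketched in a few lines of Java; this is a minimal illustration of the idea, not the actual iLanguageCloud code:

```java
import java.util.HashMap;
import java.util.Map;

public class WordFrequency {

    // Count how often each word appears in a pasted text,
    // so text size can depend on frequency rather than being random.
    public static Map<String, Integer> countWords(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String token : text.toLowerCase().split("\\W+")) {
            if (!token.isEmpty()) {
                counts.merge(token, 1, Integer::sum);
            }
        }
        return counts;
    }

    // Scale a word's count linearly onto a font size range, e.g. 12-72pt.
    public static double fontSize(int count, int maxCount, double minPt, double maxPt) {
        return minPt + (maxPt - minPt) * count / (double) maxCount;
    }
}
```

With this, a word appearing twice as often as another simply renders twice as far along the chosen size range.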
While she learns the tools (Angular.js, Travis) needed to make the modifications so that her user-type tests pass, Veronica has created a video tutorial showing how to use the Chrome app, so that users have some instructions.
This week lab members Farah and Gina will be talking about how to set up and tweak Sikuli tests for Android at GDG Android Montreal. In this talk they will show how to test image-heavy and/or legacy/hybrid Android apps using OpenCV (computer vision) and Sikuli.
Sikuli is a framework that automates anything you see on the screen. It uses image recognition to identify and control GUI components. It is useful when there is no easy access to a GUI's internals or source code, or when writing tests crosses layers of technologies, e.g. in a Cordova/HTML5 app running in a WebView.
Sikuli is an open source project started at MIT which has grown to be used by developers for diverse types of clicker testing.
Here is a video showing how Farah used Sikuli to test a Cordova/HTML5 app running in an Android webview.
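The kind of image-driven test described above can be sketched with the SikuliX Java API. This is a hedged sketch: the image file names are hypothetical placeholders, not taken from the actual test suite, and running it requires the SikuliX jar plus a visible screen showing the app under test:

```java
import org.sikuli.script.FindFailed;
import org.sikuli.script.Screen;

public class LoginFlowTest {
    public static void main(String[] args) throws FindFailed {
        Screen screen = new Screen();

        // Wait up to 10 seconds for the login button image to appear,
        // then drive the GUI purely by what is visible on screen.
        screen.wait("login_button.png", 10);
        screen.click("login_button.png");
        screen.type("username_field.png", "testuser");

        // "Asserting" here means waiting for the expected screen to appear;
        // a FindFailed exception is the test failure.
        screen.wait("welcome_banner.png", 10);
    }
}
```

Because the test only looks at pixels, it works the same whether the button is native Android, legacy code, or HTML rendered inside a WebView.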
Tonight Gina and Esma will be presenting their Kartuli Speech Recognition trainer at Android Montreal.
The talk will show how to use speech recognition in your own Android apps. It will start with a demo of the Kartuli trainer app to set the context for the talk, and then dig into the code and the Android concepts under the demo. The talk has something for both beginner and advanced Android devs, namely two ways to do speech recognition: the easy way (using the built-in RecognizerIntent for the user's language) and the hard way (building a recognizer that wraps existing open source libraries when the built-in RecognizerIntent can't handle the user's language). While Gina was in Batumi, she and some friends built an app (code) (slides) (installer) so that Kartuli users could train their Androids to recognize SMS messages and web searches. Recognizing Kartuli is one of the cases where you can't use the built-in recognizer. The talk will cover:
- How to use the default system recognizer's results in your own Android projects
- How to use the NDK in your projects
- How to use PocketSphinx (a lightweight recognizer library written in C) on Android
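The "easy way" can be sketched as a minimal Activity. The class name and request code below are hypothetical, but the RecognizerIntent extras are the standard Android API; when the device's built-in recognizer doesn't support the user's language (as with Kartuli), this path simply isn't available:

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

public class SpeechDemoActivity extends Activity {
    private static final int REQUEST_SPEECH = 1;

    // Launch the built-in recognizer UI and let the system do the work.
    void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak now");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK) {
            // The recognizer returns candidate transcriptions, best match first.
            ArrayList<String> matches =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            // Use matches.get(0) as the recognized text.
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}
```

The "hard way" replaces this Intent round-trip with an in-process recognizer (e.g. PocketSphinx via the NDK) fed with an acoustic model for the target language.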
After talking with TLG (Teach Learn Georgia) volunteers when they came down from the mountains for the weekend, it looks like older volunteers (August 2013) could share what they have learned in the field with newer volunteers (March 2014) using our open source code base called "Learn X." Learn X makes it possible to create an Android app that one or many users can use to build their own language learning lessons together, using their Androids to take video, take pictures, or record audio, backed by the FieldDB infrastructure for offline sync. Like heritage learners, TLG volunteers spend their time surrounded by the language and can understand more than they can speak, and what they speak about depends heavily on their host families and what those families talk about most.
This week Josh released the iLanguageCloud alpha on Google Play. iLanguageCloud is a fun Android application that allows users to generate word clouds using an Android share intent. Clouds can be exported as SVG for use in high-resolution graphics applications such as Inkscape, or as PNG for sharing with friends on Facebook, Google+, Twitter, Flickr or any other social application a user chooses.
Create and save word clouds from text on any website or in any other application, and share them with friends and colleagues.
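Receiving text from any other app works through Android's standard share mechanism. A minimal sketch, assuming a hypothetical receiving Activity (this is not the actual iLanguageCloud source): the app declares an ACTION_SEND intent-filter for "text/plain" in its manifest, and then reads the shared text like this:

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

// Hypothetical Activity that appears in the system "Share" menu of
// any app that shares plain text (browsers, note apps, etc.).
public class CloudActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = getIntent();
        if (Intent.ACTION_SEND.equals(intent.getAction())
                && "text/plain".equals(intent.getType())) {
            // The text the user shared from the other app.
            String sharedText = intent.getStringExtra(Intent.EXTRA_TEXT);
            // Render sharedText as a word cloud here.
        }
    }
}
```

This is why no copy-and-paste step is needed: any text a user can share becomes input for a cloud.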
This weekend Gina filled in for Android Montreal as Android mentor at Startup Weekend. Her favourite teams were GymFocus, who pitched an Android app that even got letters of intent from local gyms, and the Silent Disco Squad, who used SoundCloud to create a synchronized HTML5 audio experience.
This month Gina travelled to Shenzhen, a major manufacturing port in China, to launch one of our closed source partners' Android data collection apps. While there she visited the real "Android Market" of Shenzhen.
Lab member Gina led a hands-on workshop at Google Montreal as part of the All Girls Hack Night. The workshop showed how to get "up and running" with Android Intents in a three-part tutorial, resulting in a gesture- and/or voice-controlled remote control Android app.
Over 80 people registered for the hack night.
Try it out! Here are the installers for each step of the workshop: