Category Archives: Android

Taking a look at iLanguageCloud user reviews

It's been a few years since Josh originally released the iLanguageCloud project. iLanguageCloud uses Jason Davies' D3.js word cloud library, along with some statistics to tokenize text and identify stopwords, so that it can support text in any Unicode charset, in any language.
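As a rough illustration of the idea (a minimal sketch, not iLanguageCloud's actual code), language-agnostic tokenization can lean on Unicode character classes, and stopword candidates can be guessed statistically, since function words dominate the frequency distribution in most languages:

```javascript
// Sketch of language-agnostic tokenization and statistical stopword
// detection (illustrative only; not iLanguageCloud's actual code).
// Unicode-aware: split on any run of characters that is neither a
// letter nor a number, so it works for any script.
function tokenize(text) {
  return text.toLowerCase().split(/[^\p{L}\p{N}]+/u).filter(Boolean);
}

// Treat the most frequent tokens as stopword candidates: in most
// languages, function words sit at the top of the frequency list.
function stopwordCandidates(tokens, topFraction = 0.05) {
  const counts = new Map();
  for (const t of tokens) counts.set(t, (counts.get(t) || 0) + 1);
  const sorted = [...counts.entries()].sort((a, b) => b[1] - a[1]);
  const cutoff = Math.max(1, Math.floor(sorted.length * topFraction));
  return new Set(sorted.slice(0, cutoff).map(([word]) => word));
}
```

Because nothing here is tied to a word list for a particular language, the same two functions apply to French, Georgian, or any other Unicode text.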

Since the app was released, a surprising number of users have found it and have been using it, requesting features and providing feedback on the Play Store and Chrome Web Store.

Some teachers have even tweeted about the app!

Using @iLanguageLab word cloud to collect & display words to describe the moon. One S uses Word Central for help!

This summer Veronica will be looking over iLanguageCloud user reviews in order to document what needs to be done in the next releases. She first found that most of the reviews point to distinct user groups with different goals when they open iLanguageCloud: some users want to paste a full text and see a cloud, but most users want to see all the words they paste.

She started by identifying the user types with a CouchDB map/reduce and by learning how to do statistical analysis in LibreOffice. Once she had identified statistics to categorize user types, she added tests for these user types to the codebase using Jasmine.
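A map/reduce for this kind of categorization could look something like the following (a hypothetical sketch: the field name `doc.text` and the 40-word threshold are assumptions, not the project's actual schema or analysis):

```javascript
// Hypothetical CouchDB map function categorizing each saved cloud by
// user type: short inputs suggest a hand-picked tag list, long inputs
// suggest a pasted full text. doc.text and the threshold are assumed.
function mapUserType(doc) {
  var wordCount = doc.text.trim().split(/\s+/).length;
  var userType = wordCount < 40 ? "tag-cloud-user" : "full-text-user";
  emit(userType, 1);
}
// Paired with CouchDB's built-in "_count" reduce and grouped by key,
// the view returns one row per user type with its document count.
```

The counts per key are then easy to export for statistical analysis in a spreadsheet, as described above.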

Users are often creating tag clouds, not full-text clouds. We attribute this to users being accustomed to pre-filtering their words down to only the words they want to show, with random text sizes, rather than letting text size depend on frequency or other factors.
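Frequency-based sizing of the kind described here boils down to a small pure function fed to the layout's fontSize accessor; a sketch (the pixel range and linear scale are assumptions, not the app's actual values):

```javascript
// Sketch of frequency-proportional word sizing, in the spirit of
// d3-cloud's fontSize accessor. The [minPx, maxPx] range and the
// linear scale are illustrative assumptions.
function fontSizeFor(count, maxCount, minPx = 10, maxPx = 80) {
  if (maxCount <= 0) return minPx;
  return minPx + (maxPx - minPx) * (count / maxCount);
}
// With Jason Davies' d3-cloud this would be wired up roughly as:
//   d3.layout.cloud().fontSize(d => fontSizeFor(d.count, maxCount))
```

With sizing derived from counts, pasting a full text produces a meaningful cloud without the user having to pre-filter anything.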


While she learns the tools (Angular.js, Travis) to make the modifications that will let her user-type tests pass, Veronica has created a video tutorial showing how to use the Chrome app, so that users have some instructions.


To help decide which features get done first, visit our GitHub feature list.

Testing Android with Computer Vision

This week lab members Farah and Gina will be talking about how to set up and tweak Sikuli tests for Android at GDG Android Montreal. In this talk they show how you can test image-heavy and/or legacy/hybrid Android apps using OpenCV (computer vision) and Sikuli.

Sikuli is a framework which automates anything you see on the screen. It uses image recognition to identify and control GUI components. It is useful when there is no easy access to a GUI's internals or source code, or when writing tests crosses layers of technologies, e.g. in a Cordova/HTML5 app running in a WebView.

Sikuli is an open source project started at MIT which has grown to be used by developers for diverse types of clicker testing.


Here is a video showing how Farah used Sikuli to test a Cordova/HTML5 app running in an Android webview.




Curious about the code? You can take a look at Farah’s Sikuli tests on GitHub.


Recognizing Speech on Android

Tonight Gina and Esma will be presenting their Kartuli Speech Recognition trainer at Android Montreal.
The talk will show how to use speech recognition in your own Android apps. It will start with a demo of the Kartuli trainer app to set the context, and then dig into the code and the Android concepts behind the demo. The talk has something for both beginner and advanced Android devs, covering two ways to do speech recognition: the easy way (using the built-in RecognizerIntent for the user's language) and the hard way (building a recognizer which wraps existing open source libraries when the built-in RecognizerIntent can't handle the user's language). While Gina was in Batumi, she and some friends built an app (code) (slides) (installer) so that Kartuli users could train their Androids to recognize SMS messages and web searches. Recognizing Kartuli is one of the cases where you can't use the built-in recognizer. The talk covers:
  • How to use the default system recognizer’s results in your own Android projects,
  • How to use the NDK in your projects,
  • How to use PocketSphinx (a lightweight recognizer library written in C) on Android

Live broadcast on YouTube
Code is open sourced on GitHub

Week 1-3: Taking Learn X from clickable prototype to field testing

After talking with TLG (Teach Learn Georgia) volunteers when they came down from the mountains for the weekend, it looks like older volunteers (August 2013) could share what they have learned in the field with newer volunteers (March 2014) using our open source code base, "Learn X." Learn X makes it possible to create an Android app that one or many users can use to build their own language learning lessons together, using their Androids to take video, take pictures, or record audio, backed by the FieldDB infrastructure for offline sync. Like heritage learners, TLG volunteers spend their time surrounded by the language and can understand more than they can speak, and what they speak about depends heavily on their families and what their families talk about most.



Installer on Google Play
Open-sourced on GitHub

iLanguageCloud on Google Play

This week Josh released iLanguageCloud alpha on Google Play. iLanguageCloud is a fun Android application that allows users to generate word clouds using an Android share intent. Clouds can be exported as SVG for use in high-res graphics applications such as Inkscape, or as PNG for sharing with friends on Facebook, Google+, Twitter, Flickr, or any other social application a user chooses.

Create, save, and share word clouds from text on any website or in any other application; share your word cloud with friends and colleagues.

iLanguage Cloud is now available on Google Play

To vote for features, check out the GitHub milestones page.

iLanguageCloud uses Jason Davies' D3 word cloud layout engine; you can find his source code on his GitHub repository.

The “Android Market” in Shenzhen

This month Gina travelled to Shenzhen, China, a major manufacturing port, to launch one of our closed-source partners' Android data collection apps. While there she visited the real "Android Market" of Shenzhen.


Shenzhen Wholesale Electronics District



The "Android Market" in Shenzhen
The “Android Market” in Shenzhen

UpAndRunningWithAndroid workshop @Google Montreal

Lab member Gina led a hands-on workshop at Google Montreal as part of the All Girls Hack Night. The workshop shows how to get "up and running" with Android intents in a three-part tutorial, resulting in a gesture- and/or voice-controlled remote control Android app.

Over 80 registered for the hacknight.

Try it out! Here are the installers for each step of the workshop:

Step 1: Make it Talk

Step 2: Make it Listen

Step 3: Make it Understand Voice and Gesture

The tutorial is available on Android Montreal's GitHub; it extends our Cloud Robotics project from last March.