Category Archives: Android

Making your apps smarter @Notman House

Next Wednesday our software engineering intern Bahar Sateli will be presenting her open source Named Entity Recognition library for Android, which is powered by the Semantic Software Lab's Semantic Assistants web services platform, which in turn is powered by GATE, an open source General Architecture for Text Engineering developed at the University of Sheffield.

As part of her MITACS/NRC-IRAP funded project in collaboration with iLanguage Lab, she created an Android library that makes it possible to recognize people, locations, dates and other useful pieces of text on Android phones. The sky is the limit, as it can run any GATE-powered pipeline.

The current open source pipelines range from the very specialized (recognizing bacteria and fungi entities in biomedical texts) to the very general (recognizing people, places and dates).

She will be presenting her app iForgot Who, which takes in some text and automatically creates new contacts for you, a handy app for all those party planners out there. It is a demo application that shows new developers how they can use her open source system to make their own apps smarter and automate tasks for users.
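
The library's own API is not reproduced in this post, but the contact-creation half of an app like iForgot Who can be sketched with Android's stock contact-insert intent. The sketch below is only illustrative: the name and phone number are placeholders standing in for whatever entities the NER pipeline returns.

import android.app.Activity;
import android.content.Intent;
import android.provider.ContactsContract;

/**
 * Illustrative sketch only: hands a person name recognized by an NER
 * pipeline over to Android's built-in "new contact" editor. The name
 * and phone values are placeholders for whatever the pipeline extracts.
 */
public class ContactCreator {

    public static void createContactFrom(Activity activity, String personName, String phone) {
        // ContactsContract.Intents.Insert opens the system contact editor pre-filled.
        Intent intent = new Intent(ContactsContract.Intents.Insert.ACTION);
        intent.setType(ContactsContract.RawContacts.CONTENT_TYPE);
        intent.putExtra(ContactsContract.Intents.Insert.NAME, personName);
        if (phone != null) {
            intent.putExtra(ContactsContract.Intents.Insert.PHONE, phone);
        }
        activity.startActivity(intent);
    }
}

Delegating to the system contact editor keeps the demo small: the editor handles validation and saving, so the app only needs to supply the recognized fields.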

The presentations start at 6:30, and we will be going out for drinks afterwards at around 8:30/9:30 at Pub Quartier Latin (next to La Distillerie, at the corner of Ontario and Sanguinet, a one-block walk from the talk).

Come one, come all, for the presentation and/or for drinks!

Code is open sourced on SourceForge.

 

The Google+ event

Directions to presentation:


Directions to drinks:


 

 

Bahar presents at Android Montreal

Bahar presents to a record-breaking crowd at Android Montreal


Dyslexia and dying languages? There’s an app for that.

The Lab’s recent projects get featured on Concordia University’s website.

What do dyslexia in children and endangered languages have in common? Concordia graduate combines her expertise in linguistics and computer programming to tackle both challenges — and more…

http://www.concordia.ca/content/shared/en/news/offices/vpdersg/aar/2012/08/05/dyslexia-and-dying-languages-theres-an-app-for-that.html?rootnav=alumni-friends/news

Spy Or Not: data update

Thanks to all the users who tried Spy or Not, a gamified phonetics experiment designed to crowdsource ratings of imitations of UK British, Russian and South African accents. In the two months since its launch, the game has been installed by over 400 users on Android.

 

We were excited to see that localizing the app into Russian helped us reach Russian-speaking participants; 22% of all Android installs came from Russia and other Russian-speaking countries.

The Bilingual Aphasia Test: now available on Android

Since last April we have put our Speech Language Pathology interns Émie, Kim and Catherine hard at work learning Git, Gimp, ImageMagick, SoX, FFmpeg, Praat, Eclipse, XML and Java to bring the Bilingual Aphasia Test to touch tablets. The Bilingual Aphasia Test was created by Michel Paradis together with other members of the Bilingual Aphasia community worldwide. It is a normalized test containing 30 subsections used to diagnose and treat bilingual/multilingual aphasia patients, and the BAT is available in more than 50 language pairs.

AndroidBAT is an interactive, open source implementation of the Bilingual Aphasia Test (BAT) stimulus book: a 'virtual paper' version of the original printed BAT. Unlike a typical computer application, AndroidBAT is designed to preserve the flexibility of the original paper BAT, with the added benefit of data collection integrated directly into the test. It allows recording of eye-gaze and audio during a patient's interview, without a visible external camera or microphone, while providing more analyzable data (e.g., eye-gaze, audio, touch) than the paper format. Data can then be easily synced or shared with colleagues. AndroidBAT works on tablets and phones.
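
AndroidBAT's actual recording code is not reproduced here; the following is only a minimal sketch, assuming the RECORD_AUDIO permission is granted and using an illustrative output path, of how interview audio can be captured on a tablet with no external microphone using Android's MediaRecorder.

import android.media.MediaRecorder;

import java.io.IOException;

/**
 * Minimal sketch of in-app audio capture of the kind described above.
 * Assumes the RECORD_AUDIO permission; the output path is illustrative,
 * not AndroidBAT's actual storage layout.
 */
public class InterviewRecorder {

    private MediaRecorder recorder;

    public void start(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);      // the tablet's built-in mic
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(outputPath);                          // e.g. one file per subsection
        recorder.prepare();
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();
        recorder = null;
    }
}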

Want to use the BAT with your participants? You can download the Android Bilingual Aphasia Test on Google Play.

Our interns also braved JavaScript and built the Bilingual Aphasia Test Scorer, which allows clinicians to enter the data gathered during the BAT and get a patient profile on competencies such as phonology, morphology, syntax, lexicon, reading, writing, speaking, short-term language memory and comprehension, among others.

Twenty years ago we had two applications for scoring the BAT, the PCBAT and the MacBAT. At the Academy of Aphasia annual meeting in October 2011, members of the Bilingual Aphasia Test community got together and decided to make a web-based BAT scorer that would run on any computer and any mobile device, anywhere. The Bilingual Aphasia Test Scorer is available on the Chrome Store.

Want to adapt the Bilingual Aphasia Test for your participants, or run similar experiments that collect eye-gaze or touch data? The Android Bilingual Aphasia Test is open sourced on GitHub.

Spy or Not?

So you think you would make a good spy?

Emmy and Hisako are proud to present the release of Spy or Not, a gamified psycholinguistics experiment made in collaboration with the Accents Research Lab at Concordia University headed by Dr. Spinu.

It is commonly observed that some people are "Good with Accents": some can easily imitate various accents of their native language, while others appear to struggle with imitation. This research is dedicated to building free, open source phonetics scripts that extract the acoustic components of the speech of native and "Good with Accents" speakers and transfer the technical details, in a visualizable format, to applied linguists on the ground who work with accented (clinical and non-native) speakers.

In order to collect unbiased judgements from native speakers, a pilot study was designed and run by Dr. Spinu and her students. Images and supporting sound effects were created, and the perceptual side of the pilot was disguised as the game "Spy or Not?". The game has since gathered over 8,000 data points by crowdsourcing the judgements used to determine the degree (on an 11-point scale) to which participants are "Good with Accents". This is a novel approach to the coding problems that experimenters frequently encounter.

Participation in this project furthers research in phonetics and phonology, as well as experimental methodology in the age of the social web. Our hope is that our readers will tweet their "Good with Accents" scores and help us get more participants, especially native speakers with Russian English, Sussex English and South African accents, accents we could never access at the scale we need in a lab setting. Visit the free online game, or play offline by downloading the game from the Chrome Store or from Google Play as an Android app.

Giving robots Eyes and Ears with Android

iLanguage Lab members Theresa and Gina formed part of the Roogle team at the Cloud Robotics Hackathon 2012. At the hackathon the team worked on two robots: a Darwin-OP, a humanoid robot running Ubuntu Linux, and a Rover robot with an Arduino controlling its movement and an Android phone serving as its “eyes.”

You can get the code and the Android installer at http://code.google.com/p/roogle-darwin/
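
The Roogle code itself is at the link above; as a rough sketch of the “eyes” idea, and not the team's actual implementation, the (legacy, 2012-era) Camera preview callback can hand raw frames to whatever controls the robot. The FrameForwarder interface below is a hypothetical stand-in for that transport.

import android.graphics.SurfaceTexture;
import android.hardware.Camera;

import java.io.IOException;

/**
 * Rough sketch only: grabs camera preview frames so they can be handed
 * to whatever controls the robot. FrameForwarder is a hypothetical
 * stand-in for the actual transport (socket, cloud service, etc.).
 */
public class RobotEyes implements Camera.PreviewCallback {

    public interface FrameForwarder {
        void send(byte[] nv21Frame, int width, int height);
    }

    private final FrameForwarder forwarder;
    private Camera camera;

    public RobotEyes(FrameForwarder forwarder) {
        this.forwarder = forwarder;
    }

    public void start() throws IOException {
        camera = Camera.open();                          // back-facing camera
        camera.setPreviewTexture(new SurfaceTexture(0)); // off-screen preview target
        camera.setPreviewCallback(this);
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        forwarder.send(data, size.width, size.height);   // raw NV21 frame
    }

    public void stop() {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
    }
}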

The Bacteria Detecto-Droid Team gets Featured in Montreal TechWatch

The Bacteria Detecto Droid team was recently featured in Montreal TechWatch. The project was built for researcher John Feighery's Portable Microbiology Kit by a team of five, including iLanguage Lab members, at Random Hacks of Kindness Montreal back in December. For more updates, check out the Montreal TechWatch article “Portable Microbiology Lab – There’s an App for That!”

Android phones capable of taking these pictures cost less than $120. Combine this with the affordability of portable microbiology kits, which can be incubated using body heat, and we may end up with a sustainable solution to help fight the water problems that plague many parts of the world.

 

The Portable Microbiology Lab featured on Montreal Tech Watch

iLanguage Intents, coming soon to an Android near you…

The Lab is brewing up some open iLanguage Intents for Android developers to call GATE Natural Language Processing pipelines using the Semantic Assistants architecture. We are working on both eLanguage (English, French, Inuktitut, etc.) and iLanguage (field linguistics, machine learning) intents. Curious what intents are? Check out the list of currently known Open Intents curated by OpenIntents.org.

A diagram showing how Intents achieve a loosely coupled, modular and open environment.
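
The iLanguage Intents are still in beta, so the action string and extra keys below are hypothetical placeholders rather than the published API; the sketch only illustrates the loosely coupled pattern in the diagram, where any installed provider app can answer the request.

import android.app.Activity;
import android.content.Intent;

/**
 * Hypothetical sketch: the action name and extra key below are
 * placeholders, not the published iLanguage Intents API. It only
 * illustrates the loosely coupled intent pattern shown in the diagram.
 */
public class IntentCallerActivity extends Activity {

    private static final int REQUEST_NER = 42;

    // Placeholder action; any installed app advertising this action could respond.
    private static final String ACTION_ANNOTATE = "org.ilanguage.intent.action.ANNOTATE_TEXT";

    public void requestAnnotation(String text) {
        Intent intent = new Intent(ACTION_ANNOTATE);
        intent.putExtra(Intent.EXTRA_TEXT, text);
        if (intent.resolveActivity(getPackageManager()) != null) {
            startActivityForResult(intent, REQUEST_NER);
        }
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_NER && resultCode == RESULT_OK && data != null) {
            // Placeholder extra key for whatever annotations the provider returns.
            String[] people = data.getStringArrayExtra("org.ilanguage.intent.extra.PEOPLE");
            // ... use the recognized entities in the calling app
        }
    }
}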

If you’re an Android developer and you want to join our beta testing, drop us a line:

The Bacteria Detecto Droid wins at Montreal RHOK

iLanguage Lab member Gina was part of the “Bacteria Detecto Droid” team, which won best use of technology at Random Hacks of Kindness Montreal. Check it out in 24 Heures!

    …one of the two winning projects, a portable bacteriological laboratory. “It’s a smartphone application that detects the bacteria present in the water,” he explained. “It can tell whether the water is dangerous or not. The system accumulates the data to produce a map.”

    “We showed that it could be done with a phone,” a delighted Mr. Grassick said. “It’s inexpensive and it can be used by locals in developing countries.”

The project was one of over 30 winning RHOK projects around the world; we were also mentioned in The World article “Geeks without Borders”!

    Other projects were equally ambitious. In Portland, developers created an application to allow medical workers to track disease outbreaks in real-time. In Bangalore, hackers built a job database for unskilled workers. In Montreal, developers created an app that can scan a microscopic photo of bacteria taken from water to test for drinking safety—a key tool for poorer countries.

The code uses OpenCV, a computer vision library, to process the images on the Android client; check out the code on GitHub.
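
The team's actual pipeline is in the GitHub repository linked above; the following is only a hedged sketch of the general OpenCV-on-Android approach, not their exact code: threshold a grayscale microscope photo and count the resulting contours to get a rough colony count.

import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.imgproc.Imgproc;

import java.util.ArrayList;
import java.util.List;

/**
 * Hedged sketch of the general OpenCV approach, not the team's exact
 * pipeline: threshold a grayscale microscope photo and count contours
 * as a rough estimate of the number of bacterial colonies.
 */
public class ColonyCounter {

    public static int countColonies(Mat grayImage) {
        Mat binary = new Mat();
        // Otsu's method picks the threshold automatically; colonies become white blobs.
        Imgproc.threshold(grayImage, binary, 0, 255,
                Imgproc.THRESH_BINARY + Imgproc.THRESH_OTSU);

        List<MatOfPoint> contours = new ArrayList<MatOfPoint>();
        Mat hierarchy = new Mat();
        Imgproc.findContours(binary, contours, hierarchy,
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        int count = 0;
        for (MatOfPoint contour : contours) {
            // Ignore specks below an arbitrary, illustrative area cutoff.
            if (Imgproc.contourArea(contour) > 20.0) {
                count++;
            }
        }
        return count;
    }
}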