UpAndRunningWithAndroid workshop @Google Montreal

Lab member Gina led a hands-on workshop at Google Montreal as part of the All Girls Hack Night. The workshop showed how to get “up and running” with Android Intents in a three-part tutorial, resulting in a gesture- and/or voice-controlled remote control Android app.

Over 80 registered for the hacknight.


Try it out! Here are the installers for each step of the workshop:

Step 1: Make it Talk

Step 2: Make it Listen

Step 3: Make it Understand Voice and Gesture

The tutorial is available on Android Montreal’s GitHub; it extends our Cloud Robotics project from last March.


Making your apps smarter @Notman House

Next Wednesday our software engineering intern Bahar Sateli will present her open-source Named Entity Recognition library for Android. The library is powered by the Semantic Software Lab‘s Semantic Assistants web services platform, which in turn is built on GATE, the open-source General Architecture for Text Engineering developed at the University of Sheffield.

As part of her MITACS/NRC-IRAP-funded project in collaboration with iLanguage Lab, she created an Android library that makes it possible to recognize people, locations, dates and other useful pieces of text on Android phones. The sky is the limit, as it can run any GATE-powered pipeline.

The current open source pipelines range from very specialized (recognizing bacteria and fungi entities in biomedical texts) to very general (recognizing people, places and dates).

She will be presenting her app iForgot Who, which takes in some text and automatically creates new contacts for you: a handy app for all those party planners out there. It is a demo application that shows new developers how they can use her open-source system to make their own apps smarter and automate tasks for users.

The presentations start at 6:30, and we will be going out for drinks afterwards at around 8:30/9:30 at Pub Quartier Latin (next to La Distillerie, corner of Ontario and Sanguinet, a one-block walk from the talk).

Come one, come all, for the presentation and/or for drinks!

Code is open sourced on SourceForge.

 

The Google+ event


Bahar presents to a record-breaking crowd at Android Montreal

Dyslexia and dying languages? There’s an app for that.

The Lab’s recent projects get featured on Concordia University’s website.

What do dyslexia in children and endangered languages have in common? A Concordia graduate combines her expertise in linguistics and computer programming to tackle both challenges — and more…

http://www.concordia.ca/content/shared/en/news/offices/vpdersg/aar/2012/08/05/dyslexia-and-dying-languages-theres-an-app-for-that.html?rootnav=alumni-friends/news

FieldDB: An on/offline cloud data entry app which adapts to its user’s I-Language.

iLanguage Lab is getting ready to launch FieldDB, a cloud-based data entry app created for researchers at McGill, Concordia and the University of California Santa Cruz. FieldDB is written in 100% JavaScript and uses CouchDB, a NoSQL data store which scales to accommodate large amounts of unstructured data. CouchDB uses MapReduce to search efficiently across data, a win-win for our clients. FieldDB uses field linguistics and machine learning to automatically adapt to its user’s data. Most importantly, even though FieldDB is a web app that runs in your browser, it can run 100% offline. FieldDB will go into beta testing the first week of July, and will be officially launched in English and Spanish on August 1st 2012 in Patzún, Guatemala.
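For readers curious what searching with MapReduce looks like, here is a minimal sketch of a CouchDB-style map function over datum records. The field name (`utterance`) and the in-memory `runMap` harness are illustrative assumptions, not FieldDB’s actual schema; in CouchDB itself, `emit` is a global provided by the view server rather than a parameter.

```javascript
// Sketch of a CouchDB-style map function: index each datum by its morphemes.
// In real CouchDB, emit is supplied by the view server; we pass it in here
// so the sketch is self-contained and runnable anywhere.
function mapByMorpheme(doc, emit) {
  if (doc.utterance) {
    doc.utterance.split(/[-\s]+/).forEach(function (morpheme) {
      emit(morpheme.toLowerCase(), doc._id);
    });
  }
}

// Tiny in-memory harness that plays the role of CouchDB's view engine:
// it applies the map function to every document and collects emitted keys.
function runMap(docs) {
  var index = {};
  docs.forEach(function (doc) {
    mapByMorpheme(doc, function (key, value) {
      (index[key] = index[key] || []).push(value);
    });
  });
  return index;
}
```

The payoff of this design is that the index is built incrementally by the database, so searching a large, unstructured corpus stays fast even offline.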

FieldDB launch in Patzún Guatemala at CAML.


What is FieldDB?

FieldDB is a free, open source project developed collectively by field linguists and software developers to make a modular, user-friendly app which can be used to collect, search and share your data.

Who can I use FieldDB with?

  • FieldDB is a Chrome app, which means it works on Windows, Mac, Linux, Android, iPad, and also offline.
  • Multiple collaborators can add to the same corpus, and you can encrypt any piece of data, keep it private within your corpus, or make it public to share with the community and other researchers.

How can FieldDB save me time?

FieldDB uses machine learning and computational linguistics to adapt to the existing organization of the data you import and to predict how to gloss it. FieldDB already supports import and export of many common formats, including ELAN, Praat, Toolbox, FLEx, FileMaker Pro, LaTeX, XML, CSV and more, but if you have another format you’d like to import or export, Contact Us.

What are the principles behind FieldDB?

We designed FieldDB from the ground up to be user-friendly, but also to conform to EMELD and DataONE best practices on formatting, archiving, open access and security. For more information, see Section 6 of our white paper. We vow never to use your private data; you can find out more in our privacy policy.

Curious how it works? FieldDB is OpenSourced on GitHub.

Spy Or Not: data update

Thanks to all the users who tried Spy Or Not, a gamified phonetics experiment designed to crowdsource ratings of imitations of British, Russian and South African accents. In the two months since its launch, the game has been installed by over 400 users on Android.

 

We were excited to see that localizing the app in Russian helped get us Russian-speaking participants: 22% of all Android installs came from Russia and other Russian-speaking countries.

The Bilingual Aphasia Test: now available on Android

Since last April we have put our speech-language pathology interns Émie, Kim and Catherine hard at work learning Git, Gimp, ImageMagick, SoX, FFmpeg, Praat, Eclipse, XML and Java to bring the Bilingual Aphasia Test to touch tablets. The Bilingual Aphasia Test was created by Michel Paradis along with other members of the bilingual aphasia community worldwide. It is a normalized test containing 30 subsections for diagnosing and treating bilingual/multilingual aphasia patients, and the BAT is available in more than 50 language pairs.

AndroidBAT is an interactive OpenSource application of the Bilingual Aphasia Test (BAT) stimulus book: a ‘virtual paper’ version of the original printed BAT. Unlike a typical computer application, AndroidBAT is designed to simulate the flexibility of the original paper BAT, with the added benefit of integrating a diversity of data collection directly into the test. AndroidBAT can record eye-gaze and audio during a patient’s interview, without a visible external camera or microphone, providing more analyzable data (e.g. eye-gaze, audio, touch) than the paper format. Data can then be easily synced or shared with colleagues. AndroidBAT works on tablets and phones.

Want to use the BAT with your participants? You can download the Android Bilingual Aphasia Test on Google Play.

Our interns also braved JavaScript and built the Bilingual Aphasia Test Scorer, which allows clinicians to enter the data gathered during the BAT and get a patient profile of competencies such as phonology, morphology, syntax, lexicon, reading, writing, speaking, short-term language memory and comprehension, among others.
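As a rough illustration of the kind of per-competency tally such a scorer performs, here is a sketch in JavaScript. The answer format and the simple percentage scoring are hypothetical simplifications for illustration, not the BAT’s actual subsections or norms.

```javascript
// Sketch: tally scored answers into a per-competency patient profile.
// Each answer records which competency its subsection tests and whether
// the patient's response was correct.
function scoreProfile(answers) {
  var profile = {};
  answers.forEach(function (answer) {
    var entry = profile[answer.competency] ||
      (profile[answer.competency] = { correct: 0, total: 0 });
    entry.total += 1;
    if (answer.correct) entry.correct += 1;
  });
  // Convert raw counts into a percentage score per competency.
  Object.keys(profile).forEach(function (competency) {
    var entry = profile[competency];
    entry.percent = Math.round((100 * entry.correct) / entry.total);
  });
  return profile;
}
```

A call like `scoreProfile([{ competency: "syntax", correct: true }, …])` would then yield one score per competency, which a clinician can read as a profile of the patient’s strengths and weaknesses.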

Twenty years ago we had two applications to score the BAT (Bilingual Aphasia Test), the PCBAT and the MacBAT. At the Academy of Aphasia annual meeting in October 2011, members of the Bilingual Aphasia Test community got together and decided to build a web-based BAT scorer that runs on any computer and any mobile device, anywhere. The Bilingual Aphasia Test Scorer is available on the Chrome Web Store.

Want to adapt the Bilingual Aphasia Test for your participants, or run similar experiments which collect eye-gaze or touch data? The Android Bilingual Aphasia Test is OpenSourced on GitHub.

iLanguage Lab Sponsors NAPhC 7

Twelve years after NAPhC 1, the lab is proud to become the North American Phonology Conference’s first industry sponsor!

NAPhC 7

Mutsumi Oi, a grad student at the University of Ottawa, won the sponsored door prize: a Praat script customized to her needs. The prize includes commented source code and a screencast explaining how the script works and how to modify it to tweak it for future use. Oi has until October 31, 2012 to claim her prize; we are excited to work with her and will keep you posted if she decides to OpenSource her script.

Praat is OpenSource phonetics software by Paul Boersma and David Weenink. It has been used by generations of linguists to automate phonetic analysis and to export aligned transcriptions (TextGrids), spectrograms, intonation contours and many other visualizations of sound. Praat can be run in a GUI or on the command line. iLanguage Lab has integrated Praat into many of its Node.js experimentation server projects.

For some sample scripts: http://www.linguistics.ucla.edu/faciliti/facilities/acoustic/praat.html

Spy or Not: week one data

In one week the Gamify project has attracted roughly 700 participants from around the world, including Kazakhstan!

New visitors (whom we assume are coming to play for the first time) average 3.4 pages per visit, and most are completing the experiment, which takes an average of 5 minutes. We won’t know for a few weeks how many of the participants have usable data.

Spy or Not has seen visitors from around the world, most are viewing all three stages of the game.

Surprisingly, we also had a few installs from the Android Market; many of these users went through all three stages of the game.

Spy or Not installs on Android also show activity from around the world, most importantly in the places where we need the most participants: Russia, the UK and South Africa.

Thanks everyone for playing and sharing our game; our goal is 500 participants from Russia, the UK and South Africa. It takes only 5 minutes to play, so spread the word and challenge your friends to beat your score!