EvoHaX and the Accessibility MacGuffin

A few weeks ago I participated in EvoHaX, an accessibility hackathon which happened as part of Philly Tech Week. Ather Sharif of EvoX Labs did a wonderful job organizing it. I had other commitments during the main coding day, so we compromised and made me a judge. I also gave a little speech poking fun at their prize of a Google Chromebook. I enjoyed the experience, and I'm glad they have already said they will do it again next year.

I find it funny that I have helped plan two accessibility hackathons, but have not written a single line of code for either. I’ve had other accessibility-related commitments. Last year I spoke at the annual RubyMotion developer’s conference, and this year I gave a workshop at the University of the Arts as part of our new business called Philly Touch Tours, more on that soon. I met with Ather and the other planners and we went over the whole event. Ather had an interesting idea to pair groups with a random subject matter expert, in other words a user with a disability. This mirrors the real world – you never know when you will suddenly have to face a challenge.

On Friday April 19th the event began. Benjamin's Desk hosted it. We listened to several informative speeches. One professor specialized in rendering infographics for screen readers. A cool topic for sure, but he kept asking "What do you see in this graph?" I wanted to yell out "Nothing!" In my head I heard my high school geometry teacher saying "You're not much of a visual learner, are you?"

On Sunday I rolled in for the judging. I met the other judges and experts. I also saw my friend Faith Haeussler and her very cute kids who know the word hackathon. Everyone had finished coding. I got out my MacBook Air and prepared to begin.

Before I continue I have to explain something which it seems a lot of people don't know. Blind people tend to not use Google products. Google has become synonymous with second-rate accessibility. iOS dominates the mobile and tablet space. None of my blind friends use Droid, and I mean that literally. Zero! For the desktop we use Windows or Mac OS and their respective screen readers. I don't know anyone who uses ChromeVox. Personally I use a Mac with VoiceOver and Safari for my browser. When designing something for the blind you must remember the platforms used by the blind.

Because of this, I couldn’t get over the prize of a Google Chromebook for each member of the winning team. It really depressed me. For a few days I lay around, lamenting that I would have to participate in an accessibility hackathon that gives away Google Chromebooks as prizes. The world will end! Then I pulled myself together and remembered that the prize doesn’t really matter, all the wonderful inspiring work does. This gave me a great idea for a speech. I composed it in my head as I waited to judge the entries.

First up, West Chester University wrote a Chrome plugin called Jumpkey to easily navigate to common places on a web page, such as the home or contact links. Interesting concept. They brought over a MacBook Pro running Chrome with ChromeVox, which I had never used. It started talking in a goofy Google voice which made me laugh. I figured out a few keys and the plugin worked. One of the authors told me he could port it to Safari in an hour. I hope he does.

Next, La Salle University demonstrated their project, a browser framework called Blind Helper. They admitted they needed to find a better name. Fortunately this one worked with Safari. They designed a system for crowdsourcing image descriptions and rendering them as alt tags. I liked the idea, and the demonstration worked. However, their logo didn't have an alt tag, and the form fields did not have labels. It struck me as rather ironic. When coding an accessible platform you should make the platform accessible! They lost a point or three for that. Still it has potential.

Next, an all female team of hardware hackers from Drexel stole the show with their speech reader bluetooth module. They designed it for those with cognitive difficulties, but it has other uses as well. They used an Arduino with some other components. They even tested it with NVDA, the popular free screen reader for Windows. Excellent!

St. Joe’s presented a browser plugin for those with dyslexia to place icons next to familiar words. This helps their brain figure out the proper word by giving it some context. They could even make it multi-sensory. I couldn’t use it so couldn’t really comment, but I like the idea.

Finally, Swarthmore College presented a visual data representation of the YouTube videos which have captions, or rather the lack thereof. I couldn't see the graph but they could render it in textual ways. I also grew up in Swarthmore so wished them well.

To vote, EvoX Labs wrote a little web app for the judges. And yes, they made it accessible. I filled out my form and Faith read the results. After congratulating everyone we made speeches. I called mine The Accessibility MacGuffin. A MacGuffin refers to an object which drives the plot of a story. The object itself doesn't matter, the story around it does. For example, the briefcase in the movie Pulp Fiction doesn't really matter. We never know its contents. We only know that some gangsters have to retrieve it and protect it for their boss, using some rather extreme means to do so. This graphic scene demonstrates the power of a MacGuffin. Pay attention to the briefcase!

I didn’t know how people would feel about making fun of the prize, but it went over well. I hope the participants will think about accessibility in all their projects. I also hope they continue developing the projects started at EvoHaX. See you next year! Maybe I’ll actually get to write some code.

OpenWRT Barrier Breaker with Verizon FIOS

As I detailed at great length, I have FIOS, Verizon’s fiber optic internet service. I have never liked using stock firmware mainly for accessibility reasons, but also because we just don’t know what it may contain. For a while I used Tomato, but have decided to switch to OpenWRT. I thought I’d have an easy time porting over my network’s settings and that I could then continue on my merry way. I thought wrong.

I use AirPlay to stream music around my condo. It has worked well, but I started getting increasing amounts of dropouts. I decided to upgrade my router to use 5 GHz since it has less interference. Plus it couldn't hurt to run newer firmware. After some research I purchased the TP-Link Archer C7 because it supports 802.11ac and has four ethernet ports. I still like good old wired ethernet when possible.

The device arrived quickly and I went to work setting up OpenWRT. I went to the router's stock firmware's default address of 192.168.0.1. I found the upgrade button. A dialog box popped up: "Are you sure you want to upgrade?" Yes! Definitely!

I installed OpenWRT without incident. I set up my password and some basic settings. I enjoyed using the command line to do everything. Then I got to the part where I set up my internet connection and hit a dead end. As it turns out, Verizon FIOS has a weird issue with their DHCP lease, and some other nasty surprises. I reproduce the following instructions in hopes they will help another. I also hope they integrate this fix into the next release. This works with Barrier Breaker 14.07.

Firstly, and most importantly, you must have the MAC address of a currently working router. Copy this down before continuing.
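If you want a quick sanity check that what you copied down really is a MAC address, a one-liner in the shell can catch a typo before it ends up in the config. This is just a convenience sketch; the aa:bb:cc:dd:ee:ff address below is a placeholder, not a real one:

```shell
# Placeholder address -- substitute the MAC copied from your working router.
mac="aa:bb:cc:dd:ee:ff"

# A MAC address is six colon-separated hex pairs; verify the format before
# pasting it into /etc/config/network.
if echo "$mac" | grep -Eq '^([0-9a-fA-F]{2}:){5}[0-9a-fA-F]{2}$'; then
  echo "format looks valid"
else
  echo "not a valid MAC address" >&2
fi
```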

Download the version of OpenWRT appropriate to your router. In addition, download the patch utility, which you find in the packages subdirectory. It will have a filename like patch_2.7.1-1_r71xx.ipk. You will also need the patch from this forum thread. Remember to get the latest patch, the thread has several. As of this writing it has the filename dhcp.sh-150411.patch.

So just to recap, at this point you should have a working MAC address, the version of OpenWRT appropriate for your router, the patch utility, and the patch from the forum thread. Install OpenWRT as detailed in the wiki and stop when you get to setting up your internet.

Now use scp to copy the patch utility and the patch itself to the /tmp directory on your router.

$ scp patch*.ipk dhcp.sh*.patch root@192.168.1.1:/tmp

Now ssh back into your router, install the patch utility you copied over, and apply the patch:

# opkg install /tmp/patch_*.ipk
# cd /lib/netifd/proto
# patch -p0 -b < /tmp/dhcp.sh-150411.patch
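The -b flag in the patch command keeps a pristine copy of dhcp.sh with a .orig suffix, so recovering from a bad patch is just a copy back. Here's a throwaway simulation of that backup-and-restore behavior in a scratch directory; the file contents are made up for the demo, and nothing here touches the router:

```shell
# Scratch-directory simulation of the backup that `patch -b` keeps.
cd "$(mktemp -d)"

printf 'original contents\n' > dhcp.sh

# patch -b saves the untouched file as dhcp.sh.orig before rewriting it.
cp dhcp.sh dhcp.sh.orig
printf 'patched contents\n' > dhcp.sh   # stand-in for an applied (bad) patch

# Recovering from a failed patch is a single copy back:
cp dhcp.sh.orig dhcp.sh
cat dhcp.sh   # -> original contents
```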

The patch should succeed. If it fails, restore the backup file (the -b flag saved the original as dhcp.sh.orig). Assuming that worked, edit your /etc/config/network and configure your WAN:

config interface 'wan'
    option ifname 'eth0'
    option proto 'dhcp'
    option macaddr 'MM:AA:CC:AA:DD:RR' # replace with the MAC address you copied earlier
    option clientid 'noc'
    # Use alternate DNS servers so Verizon won't spy on you
    list dns 71.242.0.14
    list dns 71.250.0.14

Save and run:

# /etc/init.d/network restart

and hopefully everything will work. I went through so much to get this working that I had to write it down. Requiring a working MAC address really threw me for a loop. I even used the one from my previous third party router, and it still worked. So there you go, enjoy blazing fast FIOS on your awesome new router.

RubyMotion, the Apple Watch, and Accessibility

Check out this talk I gave at Philly Cocoa, a meetup for Mac and iOS programmers. It covers what we know about Apple Watch’s accessibility (not much) and shows how to build an Apple Watch extension in RubyMotion. For more information check out Apple’s WKInterfaceObject reference. I also released a new version of Motion-Accessibility with initial WatchKit support.

Apple has created the first accessible wearable but has said nothing, preferring to talk about the beautiful gold color of their new laptop or how Tim Cook wishes someone would send him some heartbeats. They made quite a presentation of the health benefits the Apple Watch will bring, and I believe accessibility fits into this theme. Right now we have only rumor and reference. Hopefully all this will change shortly. I can hardly wait!

On The Pulse for a Second Time

At the end of 2013 I appeared on a radio show called The Pulse, which airs on WHYY. I really enjoyed the experience of broadcasting, and hoped one day to do it again. A few weeks ago a reporter from the Pulse named Todd Bookman contacted me and asked if I would like to participate in a piece about web accessibility. Of course I said yes. It airs on Friday, February 6 at 09:00 AM, in about seven hours from the time of publishing. You can always listen to it online.

Several factors gave the interview a strange feel. I didn’t want to mess with my Nest thermostat because the Nest app has marginal accessibility. Since I couldn’t turn off my loud heating, we had to conduct the interview in my bedroom with the door closed. I had just received some Gotu Kola extract, a nootropic which nourishes the brain. I took two drops a half hour before the interview, and it made me feel very focused and smart. I forgot how much I enjoyed this herb.

I just rambled on about the good old days when we used text for everything and how my Mom remembers buying my Apple II/e which still works and how much I miss the BBS scene oh and by the way Todd wrote a beautiful article about an old BBS. Then we got into the World Wide Web with all of its clutter and confusion. I had to discuss a technical topic as non-technically as possible which I found challenging. Developers can use special tags to convey certain information to screen readers, such as an alt tag to describe an image. Use standard elements when possible then fall back on ARIA. Try navigating your site with a keyboard. Try a tool such as WAVE from WebAIM. If you use RubyMotion then try my own gem Motion-Accessibility. Think about accessibility as early as possible in your project. Good standard advice.
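To make the alt-tag point concrete, here is a deliberately crude shell check that flags img tags with no alt attribute. It assumes one tag per line and is no substitute for a real tool like WAVE, but it catches the laziest omissions:

```shell
# Two sample tags piped through a rough filter; the first lacks alt text.
# grep -o isolates each <img ...> tag, the second grep keeps only tags
# that have no alt= attribute at all.
printf '%s\n' \
  '<img src="logo.png">' \
  '<img src="photo.jpg" alt="A guide dog in harness">' \
  | grep -o '<img[^>]*>' \
  | grep -v 'alt='
# -> <img src="logo.png">
```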

He recorded over a half hour of audio. The final piece gets a lot of the good parts. He found a demo of the old Echo II on Twitter, which brought back good memories. Mom described hearing it as like hearing an old song, and I agree. It takes you back in the same way. On another interesting note, Angel, the student from Overbrook, came on our touch tours at the Penn Museum. Small world.

I feel honored that the Pulse chose me to appear for a second time. Perhaps this serves as the second step in my broadcasting career. If you’d like to learn more about accessibility, follow the links in this article, and if you need consulting work just ask! And again, you can listen to the piece here.

I Lived on Soylent for a Day

At the Indy Hall Fourth of July barbecue I heard about a drink called Soylent, an open source meal replacement. It takes its name from the movie Soylent Green, but unlike the movie this just has the raw ingredients a human needs to live. I have always wanted something like this so ordered some immediately. It finally arrived five and a half months later, so I decided to try living on it for a day. I felt amazing!

The summer seems so far away now. In the warm and wonderful sun the members of Indy Hall and their friends and families gathered at Liberty Lands Park to celebrate the co-working space and the community around it. At some point I got into conversation with a guy named Keith. He told me of a drink invented by a Silicon Valley startup founder because he wanted to spend less time preparing and eating food. I've always fantasized about such a thing so became instantly interested. I also appreciated the irony of finding out about it at a barbecue, which had excellent vegetarian food by the way.

I remembered the product name because of the movie so looked it up when I got home. It looked legitimate. It also appealed to my open source nature. Making the formula available to all makes it better for everyone. It means that more experts can evaluate it. The whole project has the feel of a software project, with version numbers and a change log.

I ordered a week’s supply, consisting of seven bags of powder, seven bottles of oil, and a starter kit. The starter kit consists of a pitcher and a metal scoop. The web site gave me fair notice that it would take a long time to arrive, 4-6 months. I didn’t mind since I viewed it as an experiment. It arrived five and a half months later, but re-orders will come within 1-2 weeks.

I woke up with a headache. I felt nasty and had to host an accessibility party later that night. I just wanted to eat something bland that would get me feeling better. I received a notification that my package of Soylent had arrived, and sure enough I found it after all this time, and on the perfect day. It really doesn’t get any simpler to make. They provide a digital version of the booklet which they send, a plus for accessibility. The time had come.

To make soylent, just pour a bag of powder in the pitcher, fill mostly full with water, shake for thirty seconds, add the oil and top off with water, and shake for thirty more seconds. Done. You can also use the custom cup for a single serving. Just combine a cup of powder with a teaspoon and a half of oil and two cups of water. It all seemed too easy.

I decided to start with a single serving. I put a cup of Soylent V1.3 into a glass, added water and stirred, then used a measuring spoon to measure the oil and stirred it in. I let it chill in the fridge since it tastes better when cold. An hour later I tried it. It smelled and tasted like tapioca pudding. It had a slightly milky consistency because of the oil. It tasted neutral but good. I didn’t mind it at all.

I finished the glass and before I knew it I felt full. It didn't feel fake, I felt really full. An hour later I had a little swig for the road. It hit my stomach and my body knew it had enough. I still felt a little weird, since I hadn't eaten a hot meal. I hoped it would still affect me like one. At least my headache had mostly disappeared. I felt impressed enough to make the rest of the pitcher so I'd have it when I got home.

At the party I started greeting people and felt fine. I didn't feel hungry or like I had missed a meal at all. In fact, I felt like I had eaten a rather large one. Several people brought food, but I didn't feel like anything. I asked for a few desserts and snacks to nibble on. I started eating a cookie then realized that it felt painful. I had failed to take into account that Soylent would also take away my desire for dessert! I ate a chocolate candy and thought of that classic scene from Monty Python's The Meaning of Life, where a fat guy orders and eats everything on a restaurant's menu, then eats an after-dinner mint and explodes. "It's wafer thin!"

I came home a few hours later. The feeling of fullness had started wearing off, so I had half a glass, a snack’s worth. It hit me and again I felt really full. I didn’t need anything else the rest of the night. I went to bed feeling content. My stomach felt full and I had eaten nothing. How strange and wonderful.

The next day I woke up and still felt fine. I didn't feel overly hungry or like I had done anything wrong. In fact I felt really good. I decided to carry on the experiment for a full day. I had a glass for my breakfast and another half a glass later in the afternoon. I didn't feel hungry at all. I did feel a little bloated, but they warned this would happen at first due to getting a proper amount of fiber, something most of us eat too little of. Other than that I have no complaints. I do feel concerned that it contains GMOs, but hopefully they will address that.

I decided to end the experiment after a day. For dinner I had tempura tacos. I felt unusually sensitive to the fried food, though enjoyed it. It felt good to taste flavors again. Eating solid food felt like returning home from an alien world.

I love cooking and would never want to give that up. Food connects us to the Earth and to our humanity. Blue Apron has provided me with exotic dinners, and I’ve always enjoyed good breakfasts. Still, when I can’t prepare a quick healthy meal I will use this. Soylent doesn’t need to replace food, but it can provide a replacement when necessary. It tastes and feels man made, but it may save humanity. You can find more info at http://soylent.me.