Wednesday, November 26, 2008

Announcing Emacspeak 29.0 (AbleDog)


Emacspeak Inc (NASDOG: ESPK) announces the immediate world-wide availability of Emacspeak-29 --- a powerful audio desktop for leveraging today's evolving data and service-oriented Web cloud.

Investors Note

With several prominent analysts initiating coverage, NASDOG: ESPK continues to trade over the net at levels close to those once attained by the DogCom high-fliers of yester-years; as of October 2008, it is trading at levels close to those achieved by better-known stocks in the tech sector.

Major Enhancements

  1. Speech-enables proced --- a new task manager.
  2. Emacspeak-Webspace for rapid access to content feeds.
  3. Unicode support for enabling the world's various charsets.
  4. Emacs front-end to popular Google AJAX APIs.
  5. Updated g-client with preliminary support for Google Docs.
  6. Updated URL Templates for rapid Web access.
  7. Updated WebSearch wizards for enhanced productivity.
  8. One-shot Google Search with suggestions for word under point.
  9. Emacs 23 support.

See the NEWS file for additional details.

Harnessing Emacspeak

You can visit Emacspeak at SourceForge. The latest development snapshot of Emacspeak is available via Subversion from Google Code Hosting. You can subscribe to the emacspeak mailing list by sending mail to the list request address.

Press/Analyst Contact: Hubbell Labrador
Going forward, BubbleDog acknowledges her exclusive monopoly on setting the direction of the Emacspeak Audio desktop, and promises to exercise this freedom to innovate and her resulting power responsibly (as before) in the interest of all dogs.


Tuesday, September 09, 2008

Emacspeak Webspace Goodies

Module emacspeak-webspace has a few new goodies on offer. If you activate WebSpace Headlines to obtain a continuously updating ticker of headlines, you may also at times want to find one of the headlines you heard go by and read the relevant article. Command emacspeak-webspace-headlines-view bound by default to C-RET pops up a special Headlines buffer that lists all the currently available headlines. This is a regular Emacs buffer that uses a special major mode called emacspeak-webspace-mode. This mode provides special commands to open a feed at point, follow hyperlinks etc.; use Emacs' online help facilities to learn how this buffer works.

Mode emacspeak-webspace-mode is also used to advantage in browsing information retrieved via the Google AJAX APIs described in the previous set of articles on this blog. Google Reader subscribers can now view the subscription list in a Webspace buffer via command emacspeak-webspace-reader. Additionally, command emacspeak-webspace-google provides a more convenient interface to command gweb-google-at-point --- in addition to speaking the snippet from the first search hit, this command places the first four results in a special Search Results buffer that is put in Webspace mode.

Search And Enjoy!

Wednesday, August 20, 2008

Emacspeak-WebSpace Just Got A Lot Faster

In Praise Of Google AJAX APIs

New module gfeeds.el (part of Library g-client) now implements a Lisp interface to the Google AJAX FeedSearch API. An immediate consequence of this is that module Emacspeak-WebSpace just got orders of magnitude faster --- not that it was slow to start with:-)
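For the curious, here is a rough sketch of why a JSON feed API is so fast to consume --- in Python rather than the Emacs Lisp that gfeeds.el actually uses, and against a hand-made sample response. The `responseData.feed.entries` field names below follow the shape the Google AJAX Feed API served at the time; treat this as illustrative, not as the gfeeds.el implementation.

```python
import json

# A hand-made sample mirroring the JSON shape the Google AJAX Feed API
# returned circa 2008: the feed lives under responseData.feed, with an
# "entries" array of {title, link, ...} objects.  (Illustrative only.)
SAMPLE_RESPONSE = json.loads("""
{
  "responseData": {
    "feed": {
      "title": "Example Feed",
      "entries": [
        {"title": "First headline",  "link": "http://example.com/1"},
        {"title": "Second headline", "link": "http://example.com/2"}
      ]
    }
  }
}
""")

def extract_headlines(response):
    """Pull (title, link) pairs out of a Feed API-style response.
    No XML parsing needed: the feed arrives pre-parsed as JSON."""
    feed = response["responseData"]["feed"]
    return [(e["title"], e["link"]) for e in feed["entries"]]

print(extract_headlines(SAMPLE_RESPONSE))
```

Because the server does the feed parsing, the client-side work reduces to a couple of dictionary lookups --- which is where the speed-up comes from.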

Feed And Enjoy!

Thursday, August 14, 2008

In Praise Of The Google AJAX Search API

In Praise Of The Google AJAX Search API

Emacspeak has always provided Google Search with a single keystroke from anywhere on the audio desktop. But with the coming of the Google AJAX Search API it becomes possible to integrate Google Search at a far deeper level into your fingertips! The AJAX API demonstrates the true speed of Google Search, since you don't need to wait for an HTML page to download and render --- results are served as a light-weight JSON data structure.

What You Can Now Do

Module gsearch (part of the g-client package) provides an interactive command gsearch-google-at-point --- I have this bound to key hyper-/ in Emacs. Executing this command from anywhere inside Emacs does the following:

  • Grabs word under point, and prompts in the minibuffer for a search-term --- with the word we just grabbed as the default.
  • Fetches other relevant search terms in the background via Google Suggest, and makes these available via Emacs' minibuffer history mechanism. Use keys M-n and M-p to cycle through these if needed.
  • Hitting ENTER performs a Google Search using the AJAX API, and displays the title and content snippet for the first search result.
  • Executing command gsearch-google-at-point subsequently at the same location opens the first search result.
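The two-step behaviour described above --- a first invocation speaks the top snippet, and a repeat invocation at the same spot opens the top result --- can be sketched as follows. This is a hypothetical Python rendition, not the actual gsearch.el code; the canned result list and field names (titleNoFormatting, content, url) mirror the JSON shape the AJAX Search API served at the time.

```python
# Sketch (not the actual gsearch.el logic): mimic the two-step behaviour
# of gsearch-google-at-point using a canned AJAX-Search-style result list.
CANNED_RESULTS = [
    {"titleNoFormatting": "Emacspeak", "content": "The complete audio desktop.",
     "url": "http://emacspeak.sf.net"},
    {"titleNoFormatting": "Emacspeak blog", "content": "News and tips.",
     "url": "http://emacspeak.blogspot.com"},
]

class SearchAtPoint:
    """First call at a location speaks the top snippet;
    a second call at the same location opens the top result."""

    def __init__(self, search):
        self.search = search          # term -> list of result dicts
        self.last_location = None

    def invoke(self, location, term):
        results = self.search(term)
        if not results:
            return ("speak", "No results")
        if location == self.last_location:
            self.last_location = None # consume the pending "open"
            return ("open", results[0]["url"])
        self.last_location = location
        top = results[0]
        return ("speak", f'{top["titleNoFormatting"]}: {top["content"]}')

s = SearchAtPoint(lambda term: CANNED_RESULTS)
print(s.invoke(42, "emacspeak"))   # first call: speaks the top snippet
print(s.invoke(42, "emacspeak"))   # same spot again: opens the top hit
```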

Search And Enjoy!

Friday, August 01, 2008

Tutorial: Enhancing Web 2.0 Usability Using AxsJAX

You can watch a video of the tutorial Charles and I gave as part of the Google Open Source series on July 14. Emacspeak users can play the video by pressing e e on the above link and specifying emacspeak-m-player-youtube-player when prompted.


Google is the Web's premier creator of user-friendly Web 2.0 applications, and I have long viewed it as part of our mission to do for users in the long tail (AKA users with special needs) what we've achieved for the mainstream user. Accessibility 2.0 is now a hot topic on the Web --- and we would like to move from a world where AJAX applications were a straight No-No with respect to blind users to a world where these same technologies are used to enhance their usability for everyone.

Google-AxsJAX is an Open Source framework for injecting accessibility for users with special needs --- and more generally, usability enhancements --- into Web 2.0 applications. In this TechTalk, Charles Chen and I give a hands-on tutorial on using AxsJAX to enhance the usability of Web 2.0 applications. The tutorial covers the following:

  • A brief introduction to the additional opcodes introduced by W3C ARIA to the assembly language of the Web (AKA HTML+JavaScript).
  • AxsJAX library abstractions built on the above that help Web developers generate relevant feedback via the user's adaptive technology of choice.
  • Steps in creating fluent eyes-free interaction to Web applications, including enabling rapid access to parts of a complex Web page.

This tutorial focuses on solutions we've already built and deployed both within shipping products and as early end-user experiments. Google products that we will cover include:

  • Google WebSearch
  • Google Reader
  • Google Books
  • GMail and Google Talk
  • Google Scholar
  • Google Sky

And time permitting, we might even demonstrate how I now make up for all the time I save thanks to an efficient eyes-free auditory user interface by playing JawBreaker and reading XKCD via their AxsJAXed versions.

Note that writing AxsJAX enhancements to Web applications can help you win bragging rights and cool swag! The goal of this hands-on tutorial is to help you get there faster!

Monday, July 14, 2008

Talk Announcement: Developing Accessible Web-2.0 Applications

For those of you in Silicon Valley, Charles Chen and I will be giving a talk on developing accessible Web 2.0 applications as part of the Google Open Source Series --- see details below. This will be a hands-on tutorial on ARIA-enhancing Web 2.0 applications using Google AxsJAX, and is a follow-up to the talk given at Google I/O . A video of this talk will be posted later on the Web.

Open Source Developers @ Google Speaker Series: Charles Chen & T.V. Raman

Want to learn more about creating accessible Web 2.0 applications from the creators of Fire Vox and Emacspeak? If you are near Google's Mountain View, California, USA Headquarters on Monday, July 14th, please join us for Charles Chen and T.V. Raman's presentation Enhancing Web 2.0 Accessibility via AxsJAX. They will take you through a hands-on tutorial on Google-AxsJAX, an Open Source framework for injecting usability enhancements into Web 2.0 applications. Among other topics, Charles and T.V. will cover an overview of AxsJAX's developer tools, enabling eyes-free interaction for web applications, and iterative design processes for accessibility improvements. They will also let you know the secret to getting a cool t-shirt with the Google logo printed in Braille.

Like all sessions of the Open Source Developers @ Google Speaker Series, this session will be open to the public. Doors open at 6:30 PM and light refreshments will be served. All are welcome and encouraged to attend; guests should plan to sign in at Building 43 reception upon arrival. For those of you who cannot join us in person, the presentation will be taped and published along with all public Google Tech Talks.

Thursday, July 03, 2008

ProcEd: A Speech-Enabled Task Manager For Emacs

For the last 10 years or so, view-process-mode has been my task manager of choice for monitoring and controlling the state of processes on the Emacspeak audio desktop. As of Emacs 23, AKA Emacs from CVS, that module does not work anymore --- in fact it has not been updated for several years. On the positive side, Emacs now bundles module ProcEd --- a task manager that does for processes what module DirEd does for files and directories. As of this morning, proced.el is fully speech-enabled by Emacspeak. You can install module ProcEd for Emacs 22 by obtaining the file from the Web --- you can easily find it via Google.

Share And Enjoy ... And have a great July 4th Holiday!

Thursday, June 12, 2008

Leveraging Web 2.0 Design Patterns For Enhanced Accessibility

As promised, here is a link to the Youtube video of the talk on Leveraging Web 2.0 Patterns For Accessibility given during Google I/O on May 28, 2008 in San Francisco. Emacspeak users can play the video by hitting e e on the link, and specifying emacspeak-m-player-youtube-player when prompted. You can find the downloadable slides used during the talk along with other session material on the Google I/O page for this session.

Talk Details

Leveraging Web 2.0 Design Patterns For Enhanced Accessibility T. V. Raman (Google)

HTML DOM + JavaScript constitutes the assembly language of Web Applications. Access To Rich Internet Applications --- ARIA --- adds a couple of additional op-codes to help Web applications better communicate with adaptive technologies such as screenreaders. How do we now push the envelope with respect to Web applications and adaptive technologies such as screenreaders and self-voicing browsers, in a manner similar to what we as Web developers have collectively achieved for the mainstream user? This session will demonstrate programming techniques that help Web developers experiment with and build the latest accessibility techniques into their Web applications. We will base this session on project Google-AxsJAX. Developers should know JavaScript, but the session doesn't require deep AJAX hackery.

Wednesday, May 28, 2008

AxsJAX And Auditory User Interfaces At Google IO

For those of you interested in Auditory User Interfaces and attending Google IO 2008 in San Francisco today, I'll be giving a talk on AxsJAX and Auditory User Interfaces, and be around the rest of the two days to talk about Google's work on access-enabling Web-2.0 applications. Look forward to seeing you there!

Friday, May 16, 2008

Emacspeak On Thinkpad X-61 Running Gutsy (Ubuntu 7.10)

I recently upgraded to a Thinkpad X-61 running Gutsy --- here are some brief notes on the move. In summary, all is well, and I like Gutsy running on the X-61.

Here are things to be aware of, both from a hardware and a software perspective. All of this is with X and GNOME turned off; note that some of the tips, e.g. turning off the display as described here, will cause havoc with X.

  • The X-61 display can be turned off using vbetool:
    vbetool dpms off

    for turning off the display, and
    vbetool dpms on

    for turning it back on again.
  • If you suspend to memory etc., make sure to add the appropriate vbetool command to the relevant script in /etc/acpi/resume.d.
  • Gutsy is running ALSA 1.0.15, and to date, I've not had any trouble with the ViaVoice Outloud TTS engine in this environment. Make sure to play with amixer --- specifically, run command
    amixer controls

    to understand all the various switches and controls exposed by the audio hardware on the X61. Here are some of the things that were noticeably different and worth mentioning:
    1. The PC speaker can now be manipulated via ALSA.
    2. The X-61 has two input sources. If you plug in an external mike, make sure to set both input sources to the microphone --- as opposed to leaving one or both of them to be either internal mic or mix. Specifically, plugging in an external mike does not disable the internal microphone.
    3. For some bizarre reason, it's possible to turn off the headphone output--- but you cannot control its gain.
  • Kernels later than 2.6.21 do much better with respect to power management, and this really shows on the X-61. With the LCD off, my X61 claims it'll run for 7.5 hours; if you turn off the wireless and USB1.1 as well, it claims it can go for over 8.5 hours.
  • Emacs versions: I'm running out of CVS, AKA Emacs 23 --- but the emacs-snapshot or emacs22 Ubuntu packages appear to mostly work as well. One irritation with some of the prepackaged bundles of Emacs under Ubuntu is that they don't install the Emacs Lisp sources, and this will bite if you try compiling packages like Emacs/W3.

All in all, the upgrade to Gutsy was mostly painless --- other than having to figure out the usual nits about the new hardware. The /proc/acpi/ibm support is further along but not yet complete --- as an example, /proc/acpi/ibm/video does not yet control the state of the LCD, and you cannot query the state of the display reliably through that interface.

Thursday, May 15, 2008

Emacspeak-28.0 (PuppyDog) Unleashed!

Emacspeak-28.0 (PuppyDog) Unleashed!

For Immediate Release

San Jose, CA, (May 16, 2008)
Emacspeak: --- Bringing Cutting-Edge Access For Keen Users
--Zero cost of upgrade/downgrades makes priceless software affordable!

Emacspeak Inc (NASDOG: ESPK) announces the immediate world-wide availability of Emacspeak-28 --- a powerful audio desktop for leveraging today's evolving data and service-oriented semantic Web.

Emacspeak can be downloaded from Google Code Hosting. You can subscribe to the emacspeak mailing list by sending mail to the list request address. The PuppyDog release is here. The latest development snapshot of Emacspeak is available via Subversion from Google Code Hosting.

Saturday, April 12, 2008

W4A Keynote: Cloud Computing And Equal Access For All

I'll be giving the opening keynote at the upcoming W4A 2008 conference in Beijing on April 21. You can find an online version of the paper along with the slides here: Cloud Computing And Equal Access For All. Coincidentally, another excellent Web 2.0 accessibility event is happening on the same day in London --- see Accessibility 2.0 --- it's unfortunate one cannot be in multiple places at opposite corners of the globe at the same time!

Tuesday, April 01, 2008

Emacspeak Goes Social

Leveraging The Benefits Of Free Speech!

For Immediate Release:
April 1, 2008

Live From San Jose ... Emacspeak Goes Social!

Investors and users alike welcomed today's announcement that Emacspeak (NASDOG:ESPK) would be going social --- Going Social is better than Going Postal!

As a pioneer in the space of eyes-free information access, and a firm believer in free speech and free software, Emacspeak will now help users go social speech-free --- all users need do is use the system. When in use, the free-social features of Emacspeak will talk to others on your behalf, answer inane questions, and contribute to the community by, in its turn, asking even more inane questions of everyone else. In a repeat of the network effect that has led to the resounding success of systems like the World Wide Web and The Blogosphere, these viral features in Emacspeak are expected to win one's running instance many social connections. The longer one uses these features, the deeper one's social graph --- going forward, the information encapsulated in these social graphs will be converted to ever-increasing stacks of small pieces of green paper.

Coming Soon!

As these features are launched over the next few weeks, expect Emacspeak-generated conversation streams to show up everywhere, ranging from Twitter streams to random email messages that you can usefully forward to spammers. This innovative approach to communication finally adds value to spam --- and is being hailed as the next biggest business model to hit the ether. By making such content available on the Internet, the system will foster the long-term human goal of organizing and searching all the world's ignorance to make it universally accessible --- thereby bringing ignorance on par with knowledge!

Friday, March 28, 2008

My Web-2.0 Application Is Feeling Accessible

If you feel up to Web hackery and want to win a cool T-shirt in the bargain, see My Web-2.0 Application Is Feeling Accessible!. You can see examples of what you can achieve with this framework in the AxsJAX showcase.

Thursday, March 06, 2008

Emacspeak WebSpace --- Interaction-Free Information Access

A few months ago, I started an Emacspeak module called emacspeak-webspace that is now ready for wider use. The goal of this module is to unobtrusively fetch useful information from the Web and communicate it at those times that one is context-switching among tasks. I gave a talk on user interaction at the last Hackers Conference in November; in the same session, there was another talk whose gist was a plea for less human-computer interaction --- motivation: User Interfaces are nice, but wouldn't it be nice if one didn't have to explicitly interact with the machine to get information? The speaker coined the term Zen interfaces in that context, something that stuck in memory long after the talk.

I built that thought into module emacspeak-webspace. Conceptually, it consists of smart fetchers that fetch information asynchronously from the Web, and smart displayers that communicate this information at appropriate times. These are detailed below.


There are two fetchers at present:

  • Weather: fetches current weather conditions for your location.
  • Headlines: fetches headlines from a customizable collection of ATOM and RSS feeds.

Note that this module is not intended to be an RSS or ATOM feed-reader; for that, use module greader --- an API-based Google Reader client that is bundled with Emacspeak.
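Conceptually, the fetcher side of this design can be sketched as follows. This is a hypothetical Python rendition of the architecture, not the actual emacspeak-webspace Emacs Lisp: a background thread keeps a cached value fresh, and the display side reads the cache without ever blocking on the network.

```python
import threading, time, itertools

class WebspaceFetcher:
    """Sketch of the smart-fetcher idea: a background thread keeps a
    cached value fresh, so the display side never blocks on the Web.
    (Hypothetical Python model, not the Emacs Lisp implementation.)"""

    def __init__(self, fetch, interval=300):
        self.fetch = fetch            # e.g. pull a feed or weather report
        self.interval = interval      # seconds between refreshes
        self.cached = "Fetching..."
        self._lock = threading.Lock()

    def refresh(self):
        value = self.fetch()          # may be slow: runs off the UI path
        with self._lock:
            self.cached = value

    def start(self):
        """Refresh asynchronously, forever, in a daemon thread."""
        def loop():
            while True:
                self.refresh()
                time.sleep(self.interval)
        threading.Thread(target=loop, daemon=True).start()

    def display(self):
        with self._lock:
            return self.cached        # instantaneous read for the header line

headlines = itertools.cycle(["Headline one", "Headline two"])
f = WebspaceFetcher(lambda: next(headlines))
f.refresh()                           # one synchronous refresh for the demo
print(f.display())
```

The key design point is the split: fetching is asynchronous and slow, displaying is synchronous and instantaneous, and the two only meet at the cache.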

Communicating Useful Information Usefully

With the information in hand, the next question is how does one communicate this information usefully, and what does at the appropriate time mean? Things to avoid:

  • Do not require explicit user action to hear the information.
  • Avoid chatter --- do not create an auditory user interface that chatters at the user all the time.

These are conflicting constraints. Notice that in a visual interface, one can meet the interaction-free requirement by displaying the information in a toolbar or sidebar and allow the user to ignore or absorb the information at will.

Emacspeak uses Emacs' header-line to display the continuously updating information. This meets the interaction-free requirement. The header line updates every time Emacs updates its display, and automatically speaking it would produce too much feedback. But Emacspeak doesn't automatically speak the header-line; it only speaks it when there is a context-switch.
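The speak-on-context-switch rule can be sketched as follows; this is an illustrative Python model, not Emacspeak's actual implementation. Display updates silently rewrite the header, and speech happens only when the user switches to a different buffer.

```python
class ContextSwitchSpeaker:
    """Only voice the header line on a context switch, never on a mere
    display update (an illustrative model of the rule described above)."""

    def __init__(self):
        self.current_buffer = None
        self.spoken = []              # stand-in for calls to the TTS engine

    def update_header(self, buffer, text):
        buffer.header = text          # silent: just a display refresh

    def switch_to(self, buffer):
        if buffer is not self.current_buffer:
            self.current_buffer = buffer
            if getattr(buffer, "header", None):
                self.spoken.append(buffer.header)

class Buffer:
    pass

speaker = ContextSwitchSpeaker()
scratch = Buffer()
speaker.update_header(scratch, "72F and sunny")   # no speech
speaker.switch_to(scratch)                        # speaks the header
speaker.update_header(scratch, "73F and sunny")   # still no speech
speaker.switch_to(scratch)                        # same buffer: silent
print(speaker.spoken)
```

Only the context switch produces speech, so a continuously updating display never turns into continuous chatter.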

How To Use

Here is how I am using emacspeak-webspace at present:

  • Activate weather display in the calendar and scratch buffers.
  • Activate feed headlines in selected shell buffers.

You hear the updated information when switching to buffers where the webspace display is active.

Activating WebSpace Displays

Webspace displays are activated via the following commands; all Webspace displays are placed by default on prefix key hyper-space.

  • emacspeak-webspace-headlines: hyper-space h
  • emacspeak-webspace-weather: hyper-space w

Share And Enjoy, And May The Source Be With You!

Thursday, January 03, 2008

Announcing: The Coming Of Piglets To The Emacspeak Desktop

This is to announce a new emacspeak module called Piglets that brings together Emacs and Firefox to create a powerful framework for authoring Web interaction wizards.

Why Piglets?

You might well ask Why Piglets?, and might conjecture that PIGLETS might stand for Powerful Internet Gadgets for a Light-Weight Talking System. You might conjecture that the Emacspeak mascot likes pig-ears; or you might even think of attributing it to the fact that my friend and colleague Charles Chen, creator of Fire Vox, was born in the year of the pig. But you'd be mostly wrong in all of the above.

Piglets on the Emacspeak desktop are the result of having two large (and powerful) software pigs connect over a socket. A few months ago, I blogged here about MozREPL and how it allows me to Put The Fox In A Box. Piglets mark the completion of the Emacs/Firefox integration that started with Firebox. Once you install Fire Vox, the free self-voicing extension for Firefox, piglets become a versatile means to leverage the self-voicing Fire-Vox/Firefox DOM from the comfort of the emacspeak environment.

What You Need

Caveat: ALL of this is early experimental software --- and you'll need to tweak things for your environment to get things working.

  • A version of Emacspeak from the subversion repository.
  • Check the installation of the servers/python files in your Emacspeak installation.
  • Confirm that the HTTP wrapper for the TTS servers works. You can most easily do this by running:
    # Start the HTTP speech server
    # from emacspeak/servers/python
    python  outloud 2222 &
    # You can replace outloud with dtk-exp,
    # but bindings to other TTS servers are not defined for now.
    # Run wget to test the speech server:
    wget --post-data='speak:hello world' localhost:2222

    If all is well, you should hear the TTS engine say Hello World.

    Do not proceed if the above does not work.

  • Install Fire-Vox and configure it to use the Emacspeak TTS server. You can do this with ORCA providing the feedback. Alternatively, once you have installed Fire-Vox, shut down Firefox and then add the following line to your ~/.mozilla/firefox/default/prefs.js:
    user_pref("firevox.LastWorkingTTS", 4);

    This sets up Fire-Vox to use the running HTTP speech server you started earlier.
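As an aside, the wget test shown earlier can also be driven from Python. The snippet below only constructs the request, so it runs without a server; the speak: payload convention and port 2222 follow the test above, while the trailing / path is an assumption of this sketch.

```python
import urllib.request

def make_speech_request(text, host="localhost", port=2222):
    """Build (but don't send) a POST equivalent to the wget test above.
    The 'speak:' prefix follows the payload convention shown in the post;
    the '/' request path is an assumption of this sketch."""
    data = f"speak:{text}".encode("utf-8")
    return urllib.request.Request(f"http://{host}:{port}/", data=data)

req = make_speech_request("hello world")
print(req.get_method(), req.full_url, req.data.decode())
# To actually speak (with the server from servers/python running):
# urllib.request.urlopen(req)
```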

Loading And Running Piglets

The Piglets framework is implemented in module emacspeak-piglets.el. There is a Fire-Vox binding in module emacspeak-firevox.el and a binding to the JawBreaker game in emacspeak-jawbreaker.el.

How Does It Work?

When you get the various pieces configured and working, here is how things work:

  • Piglets place you in a special interaction buffer in Emacs.
  • Typing keys in this buffer go to Firefox.
  • Control keys send commands to Firefox using MozREPL.
  • The latter is most useful in conjunction with Web 2.0 applications that have been AxsJAXed.
  • Additional commands give access to FireFox features such as the URL bar. As an example, hit C-w and type the phrase you'd like to go into the address bar --- either a URL or a search term. As an example, try typing XKCDComic. This will automatically do a Google Lucky Search (thanks to FireFox magic) and open the XKCD site.
  • But wait, there's more! Because XKCD has been AxsJAXed, you will hear Fire-Vox automatically speak the comic strip and its associated transcript. Hit ? in the FireFox interaction buffer to hear the available keystrokes for this AxsJAXed application; in general, ? brings up keyboard help for AxsJAXed applications.
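Under the covers, MozREPL simply accepts JavaScript text over a TCP socket, which is what makes this kind of Emacs/Firefox bridge possible. Here is a minimal, hypothetical Python sketch of the idea --- the real Piglets code lives in emacspeak-piglets.el, and the default port 4242 is MozREPL's usual setting rather than something taken from this post.

```python
import socket

MOZREPL_PORT = 4242   # MozREPL's default listening port (assumption)

def goto_url(url):
    """JavaScript to point the current Firefox tab at a URL."""
    return f'content.location.href = "{url}";\n'

def send_to_firefox(js, host="localhost", port=MOZREPL_PORT):
    """Ship a line of JavaScript to Firefox over the MozREPL socket.
    (Sketch only: the real Piglets code lives in emacspeak-piglets.el.)"""
    with socket.create_connection((host, port)) as s:
        s.sendall(js.encode("utf-8"))
        return s.recv(4096).decode("utf-8")   # REPL echo / result

print(goto_url("http://xkcd.com"))
# send_to_firefox(goto_url("http://xkcd.com"))  # needs Firefox + MozREPL
```

Everything else --- key forwarding, AxsJAX commands, the URL bar --- reduces to generating the right JavaScript string and shipping it down this socket.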


TODOs

These are some TODOs that I plan to get to eventually --- if you have coding cycles to contribute, feel free to work on these.

  • Create an HTTP binding to the TTS servers using TCL and the TCL HTTP libraries. This will eliminate the dependency on the Python wrapper I originally wrote for ORCA in fall 2005.
  • Write more Piglets.
  • Make installation and configuration simpler.
  • Test installation and configuration of the various pieces.