Whither Amaya?

It’s been over a year since the W3C released a new version of its Amaya prototype XHTML editor.  This is not good.  I’d go so far as to call it ++ungood.  With apologies to Daniel Glazman and the many people who’ve worked on Composer-based tools, Amaya has long been my preferred tool for writing HTML.  It’s had support for MathML and SVG forever, and its UI provides a lot of power without getting in the way.  Oh, and it works on Windows, Linux and Macintosh… kind of like another code base I also love…

I realize Mozilla’s probably in no position to fund Amaya development, and I don’t believe it should.  I’m posting this to call attention to a project I find extremely useful and inspirational (for Verbosio, my XML editor project).  Maybe someone among the W3C members or the global software community can help out a bit.

Three operating systems, three purposes

I’ve stated before that I work with Mozilla code on three different operating systems.  My desktop runs Fedora 15 and Windows 7.  My MacBook runs Mac OS X 10.6.  These three give me coverage on all three platforms.

That said, I realized a few minutes ago each system serves a different purpose for me:

  • With Fedora, I get fast compile times, especially incremental builds, and fast test execution.
  • With Windows, I get a really good free debugger with Visual Studio Express.
  • With Mac, I get portability:  I can write my code on the go.

It’s actually a pretty good situation, I think.  Is anyone else in a similar situation?

Why I’m attending college: trying to explain my thinking

A couple of months ago, I started taking classes at Chabot College.  One reason is to earn the degree that shows I know what I’m talking about.  But there’s another reason, even more fundamental:  filling in the gaps in my own knowledge.

Case in point, I’ve been thinking long and hard about building a JavaScript-based DOM.  The problem I’m facing is that to do everything I want – shadow content, undo/redo transactions, etc. – I need a specialized data structure.  Specifically, I need a data structure much like a multi-dimensional array or hashtable.

(The first dimension would be a set of object keys – the DOM nodes.  Another dimension represents undo history, like a timeline.  A third dimension could be the shadow content.  Other dimensions might exist for building an individual transaction until it is completed, or for creating a “workspace” for experimenting.)

About 24 hours ago, I had an idea related to my multi-part hashtable keys implementation.  Typically, in designing a data structure class, I first think about how to give each piece of data an address, then implement ways to store data by that address.  The idea is to flip that approach around:  define an API that lets the user place an object at an address, then add new dimensions to the address space as the user needs them.
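To make the idea concrete, here is a rough sketch of what that flipped API might look like.  All names here are hypothetical (this is my illustration, not a real implementation), and a production version would have to migrate existing entries when a new dimension is added:

```javascript
// A hedged sketch: the address space grows as the caller declares
// new dimensions, and each stored value lives at a coordinate tuple.
function AddressableStore() {
  this._dimensions = [];     // dimension descriptors, added on demand
  this._entries = new Map(); // serialized address -> stored value
}

// Grow the address space. Entries that don't specify a coordinate
// for this dimension fall back to the default coordinate.
AddressableStore.prototype.addDimension = function (name, defaultCoord) {
  this._dimensions.push({ name: name, defaultCoord: defaultCoord });
};

AddressableStore.prototype._key = function (address) {
  // Build the full coordinate tuple, filling in defaults for
  // any dimension the caller left out.
  var coords = this._dimensions.map(function (d) {
    return (d.name in address) ? address[d.name] : d.defaultCoord;
  });
  return JSON.stringify(coords);
};

AddressableStore.prototype.set = function (address, value) {
  this._entries.set(this._key(address), value);
};

AddressableStore.prototype.get = function (address) {
  return this._entries.get(this._key(address));
};
```

For example, a DOM-nodes dimension plus an undo-history dimension would let you ask for “this node, three steps back in time” with `store.get({node: someNode, time: 3})`.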

If I’m not making any sense, that’s fine.  In fact, that’s why I’m writing this blog entry.  I’m having lots of trouble articulating last night’s idea formally.  I can see it crystal-clear, and I can write JavaScript to implement it, but I don’t have the language to describe it in English yet.  I spoke briefly with my Calculus instructor tonight, to figure out what area of mathematics my idea might fall into, and he suggested linear algebra (that my idea relates to vectors in some way).  I can’t take that class until I complete Math 1 and Math 2 (both are Calculus classes; I’m enrolled in Math 1).  At Chabot, the linear algebra class is Math 6.

This underlines why I’m going to college after at least six years as a professional software developer.  It’s a gap in my knowledge.

Some people, like me, enter this field with a talent built upon years and years of tinkering, of experimenting and of thinking.  The software industry can really take a person like that quite a ways.  Others enter the industry having taken computer programming courses – and that’s really hit or miss for an inexperienced person.  (No offense intended to the software engineers who started through college!)

I wonder if taking up college classes after you’ve been in the industry a while is actually the best approach of all:  continuing the education, helping clarify what you’re working on and expanding the imagination with new possibilities.

I wonder how many current software engineers have decided to go back to college after being in the field a while, to push themselves and their capabilities even further.

Jarring web content?

Many years ago, the Mozilla Application Suite implemented support for a jar: protocol.  It lets you refer to files inside a ZIP-compressed archive served as application/jar-archive or application/x-jar.  A typical URL looks like jar:https://alexvincent.us/somepath/target.jar!/test.html.  The idea is that you download one file (target.jar in this case), and your browser extracts the rest of your files from that archive.

Today, Mozilla Firefox uses the protocol as part of how it stores its user interface (in omni.jar these days) on the local file system.

It’s a pretty interesting concept.  There’s been a large push in recent years towards JavaScript minification, to reduce download sizes.  This is especially true as scripts have recently ballooned to hundreds of kilobytes.  If you have a lot of JS, HTML, CSS, XML, and other plain-text files that rarely change, it might be worth putting them into a single JAR for users to fetch – you get additional compression on top of your minification, and there’s one HTTP request instead of several.

With one major catch, though.

As far as I can tell, Mozilla Firefox is the only major browser to support the jar: protocol.  This is not news.  Even Google Chrome has not implemented support for the jar: protocol.

Naturally, Firefox’s history with the jar: protocol hasn’t been perfect.  They did fix the one publicly known issue I found, and wrote about it on MDN.  I’m not aware of any other issues, so in theory it should be safe to use.

I’m thinking about using this capability on my website.  For the “Visualizing the DOM” series I’m developing, I have a lot of JS files in use, including some library files.  Ogg Vorbis doesn’t compress well (on a 9-minute audio file I saw a 1% benefit), so I won’t include that.  Alas, if I lock out every major browser except Firefox, I’m not going to be too happy.  The good news is that the HTML file which loads my “theater” application is PHP-generated, so based on the user agent I can probably serve Firefox users a JAR archive and everyone else the uncompressed files directly.

Comments welcome!  Particularly if you think this is a good or a bad idea.

Making a hash of things, part 2: Multi-part keys

Last year, I wrote about hash tables for JavaScript. This was before I knew about the plans to implement a native hash table for JavaScript called WeakMap in Mozilla Firefox.

WeakMaps are awesome.  They completely obsolete the need for that hashStringKey hack I came up with last year.  When you combine them with JavaScript proxies, you can get something even more awesome called a membrane.  (I didn’t make up the name.)  Through the membrane you can ensure a proxy either returns a primitive value (number, string, boolean, etc.), or another proxy.  In fact, if two underlying “native” objects A and B refer to each other, the membrane can ensure their proxies, pA and pB, refer not to A or B but to each other:  pA.b == pB, and pB.a == pA.
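Here is a minimal sketch of the membrane idea, written against the standardized Proxy and Reflect APIs rather than the experimental proxy API Firefox shipped at the time.  The WeakMap caches one proxy per underlying object, so related objects always see each other’s proxies:

```javascript
// One proxy per native object, cached so identity is preserved.
const cache = new WeakMap(); // native object -> its proxy

function wrap(value) {
  // Primitives pass through the membrane unchanged.
  if (value === null ||
      (typeof value !== "object" && typeof value !== "function"))
    return value;

  if (!cache.has(value)) {
    cache.set(value, new Proxy(value, {
      get(target, prop) {
        // Anything read through the membrane is itself wrapped,
        // so proxies only ever refer to primitives or other proxies.
        return wrap(Reflect.get(target, prop));
      }
    }));
  }
  return cache.get(value);
}
```

With this, if native objects A and B refer to each other, `wrap(A).b` yields the cached proxy for B, and `wrap(B).a` yields the cached proxy for A.  (A real membrane would also trap `set`, `apply`, and the rest, and unwrap values crossing back the other way; this sketch shows only the read path.)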

A couple things you can’t do with WeakMap

First, keys cannot be primitive values.  You can’t say:

var map = new WeakMap();
map.set("attributes", []); // throws an exception because "attributes" isn't an object

It just doesn’t work.  I asked Jason Orendorff about it, and the reason has to do with garbage collection.  Simply put, weak maps hold references to their keys loosely:  when no one else knows the key, the JavaScript engine can safely erase both the key and the value.  With objects, that’s easy:  they’re unique.  When you copy one, you get a distinct object that isn’t equal to the original.  With primitives like a simple string, that’s not so easy:  every reference to the original string could disappear from memory, yet the same string could be hard-coded elsewhere, and the weak map would have to remember its value.  WeakMap currently deals with the problem by forbidding primitive keys.

Second, it’s one key to one value.  That’s what hash tables are, and what they generally should be.  But there’s no concept of a two-part key, as there is in public-key encryption, nor a three-part key, nor an n-part key.  So there’s no way to say that two keys are related to each other.  Think of a two-dimensional grid: each cell has a row and a column.  The row and the column combine to form a key for looking up the cell’s value.

My very experimental solutions

For the first problem, I implemented a brute-force primitive-to-singleton-object class, PrimitiveKeySet.  It creates an object for every primitive it sees and returns that object, handing back the same object whenever it sees the same primitive again.  I also implemented a WeakMapWithPrimitives function, which leverages PrimitiveKeySet and wraps the WeakMap API.  It doesn’t solve the memory-leakage problem – nothing can, really – but it at least lets me use primitive keys.  I also tried to be a little smart about it:  when you tell it to delete a primitive key, it really does.
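The idea might look something like this.  This is a sketch of mine, not the actual PrimitiveKeySet source (which lives in the library linked below); the lookup table here holds strong references to its stand-in objects, which is exactly the memory trade-off described above:

```javascript
// Map each primitive to a singleton stand-in object that can
// legally serve as a WeakMap key.
function PrimitiveKeySet() {
  this._objects = new Map(); // primitive -> its stand-in object
}

PrimitiveKeySet.prototype.getKey = function (primitive) {
  // Return the same stand-in object every time for a given primitive,
  // so lookups by that primitive find the same WeakMap entry.
  if (!this._objects.has(primitive))
    this._objects.set(primitive, { value: primitive });
  return this._objects.get(primitive);
};

PrimitiveKeySet.prototype.delete = function (primitive) {
  // Dropping the stand-in lets its WeakMap entry be collected.
  return this._objects.delete(primitive);
};
```

Usage: `weakMap.set(keySet.getKey("attributes"), [])` now works, because the WeakMap key is the stand-in object, not the string itself.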

For the second problem, I did a little bootstrapping, using a tree analogy.  I started with a WeakMap (the “root map”).  I used the first part of the key to assign another WeakMap (call this a “branch map”) as a value stored in the root map.  I would use the second part of the key to assign another WeakMap to the branch map.  I repeated this over and over again until I reached the last part of the key and the last WeakMap I needed (the “leaf map”, for lack of a better term). At that point I assigned the user’s value to the last key part, on the leaf map.

I could easily say, then:

var map = new CompositeKeyMap(["row", "column"]);
map.set({
  row: 3,
  column: 4
}, "Row 3, Column 4");

I took it one step further, and added a shortcut, the PartialKey.  With a PartialKey, I wouldn’t have to specify every field, every time:

var map = new CompositeKeyMap(["row", "column", "floor"]);
var floorKey = map.getPartialKey({floor: 6});
floorKey.set({
  row: 3,
  column: 4
}, "Row 3, Column 4");
map.get({row: 3, column: 4, floor: 6}) // returns "Row 3, Column 4"

You can see all of these on my main web site under libraries/CompositeKeyMap, along with a Jasmine test suite.

Should you use these?

If you don’t intend to require Mozilla Firefox 6+, probably not.  This code won’t work without that, and there are no fallbacks.

If you want to use partial keys, the CompositeKeyMap could be very useful indeed.  I’d recommend that at least one key field you specify be for objects only:  otherwise you might as well just use an ordinary JavaScript object, {}, as your map.  Whether that field should be first or last for memory efficiency, I can’t tell you.

I don’t see much use for WeakMapWithPrimitives, to be honest.  I did that as a proof-of-concept only, a stepping stone to the CompositeKeyMap.

Thanks for reading – and feel free to play with it or review the code as Mozilla Firefox 6 launches next week.  Comments welcome!

Welcome to my new digital home. Watch out for falling bytes.

Well, it was time.  I finally decided to launch my own web site, again.  Truth be told, in all the years I had a blog, I never had a website of my own.  I didn't feel I really needed one.  Many thanks to the MozillaZine crew for hosting me all these years.

So why'd I move?  There are a few reasons:

  • I needed a home for JavaScript demos, and free blogs just don't cut it for that.  (Which is why I didn't end up at WordPress.com.)
  • I've been playing with WebGL a bit, and HTML 5 Audio.  Fifteen seconds of Ogg Vorbis audio is around 200 KB, even for just speech! When I want to do five minute shows, that's a non-starter on a free blog.
  • To be honest, I've been a bit of a cheapskate, too. The free ride had to end sometime.

I've also gone ahead and purchased HTTPS hosting. Why? Admittedly, it's overkill. But at the same time, I want you to know that the files on this site are coming from me. In particular, I want people who don't trust the Web (NoScript, anyone?) to trust me. This really is just a personal site for me, to showcase my public works of Web technology.

Yes, Verbosio, my prototype XML editor, is the main focus of those works. I'm considering moving it over here, as it never really was a good fit with MozDevGroup's aim of supporting extensions. (Mine's a full-fledged vaporware application.)

Speaking of WebGL, I've got a cheap prototype demonstration of circles in 3-D space going. Don't ask me why the circles are a little flattened. I'm hoping someone out there can tell me why. It's all leading towards a series of presentations I'm calling “Visualizing the DOM”, and the first installment's coming soon (probably in a month or less).

The college journey begins…

I’m taking college classes at Chabot College, for an Associate’s in Science degree (major: Computer Science, emphasis in Mathematics). My online classes start today. 🙂

Most people I’ve worked with were surprised when I told them I have no college education whatsoever. Oh, sure, I attended an A-School in the U.S. Navy to be a Journalist, but I don’t think that counts. I’m 33 years old, and I spent the last decade building up my resume. Now, it’s time.

Chabot recommends a person working full-time take no more than 6 units per semester. I’m starting with Real Estate Principles (3 units) and Sports Officiating (2 units) for the summer, then Math 1 (Calculus, 5 units) and Volleyball Beginners (1 unit) in the fall. If I think I can handle more, I’ll take more. If I think it’s too much, I’ll take less.

I put this off for a long, long time. I remember an incident shortly after my book was published in 2002: I was attending OSCON to promote the book. An employee of Amazon.com talked to me and said he wanted to hire me. I was more than willing. Then via e-mail he saw my resume had no degree on it, and he said he could not hire me. I’ve always remembered that, with sadness and a hint of bitterness. I was qualified to do the work then (at least, I thought so), and I would have enjoyed it. Alas, it was not to be. One of the reasons I’m going to college is to correct that hole in my resume, to show that I know what the hell I’m doing.

What’s this mean for my other projects? Well, work still takes first priority, even over college. But my pet projects like Verbosio (the prototype extensible XML editor) are probably going to be back-burnered again.

I’m not sure what I can do about Venkman. I’ve had five different people write me on my blog expressing interest in working on my proposed rewrite, and I think that’s great. I’d still be willing to mentor, but it needs a leader who’s willing to get his or her hands dirty and dive in. Last time I cited a need for dockable XUL panels. We need someone to step up, to create visual “mocks” of how they perceive this. Then we need to write some code.

Finally, I know I haven’t been very active in the Mozilla community lately. College means I’ll be even less active, and that does make me sad. I wish my own pet project was ready for others to play in, but it’s not. I wish I had the time to contribute to ongoing DOM Core or developer tools work, but I don’t.

I’ve said it before, and I’ll say it again: this community and platform have given me a career, and I am eternally grateful for that. If I ever do work for Mozilla, maybe my business card should say “Lizard bridge builder” or something like that. Because you guys have built bridges for me, so far, and I’m not giving up yet.

Venkman thoughts 2011, part 2

First of all, thanks to everyone who’s responded so far to my statement about Venkman dying yesterday. I’ve had a few thoughts and a few communications since then, and I thought I’d try to answer them here.

Venkman versus Firebug

The most important question I’ve gotten so far is “Well, what does Venkman have that Firebug doesn’t?” As I said in comments, I don’t know, because I have almost never used Firebug. Apparently, several years ago, a few people compared the two (search results courtesy of Google.com).

Since my blog isn’t really a good place to collect this data, I figured I’d start a comparison wiki page where we can collect the features of each. Firebug and Venkman fans, please, help me out with some facts – log in and write them down!

On another note, the question itself bothers me a bit. Eight years ago, you could ask “What does Macintosh do that Windows doesn’t?” We were in a monoculture back then (and still are). You could ask “What does Mozilla do that Internet Explorer doesn’t?” about eight years ago, too. Again, a monoculture existed then.

I agree, Firebug is a very impressive tool, even if I haven’t used it very much. (Something about it must be good, if so many people use it and support it regularly.) Also, remember Firebug itself came years after Venkman… and JavaScript debugging was a monoculture then too. Firebug had a compelling answer then. Venkman, having languished in the shadows for years, doesn’t really have a compelling answer now, but that’s beside the point.

When you have at least two complete, healthy projects using the same interfaces, you’re probably doing something right. The W3C works like this: few specifications reach Proposed Recommendation status without two independent complete implementations. The spec may have bugs, and the implementations certainly will… but it provides a level of confidence that the spec is usable.

Now, someone might write a JS debugger UI independent of both Firebug and Venkman, using JSD2… and that would be great. Still, the question bothers me, and I can’t quite put my finger on why; it’s a gut feeling.

The Venkman community: diehards

The second thing I notice from replies so far is that there are a few enthusiasts still out there. 🙂 It’s nice to know, and it’s appreciated. No one’s committed to working on a rewrite yet (not surprising – it’s a huge task). I certainly haven’t figured out high-level details of a rewrite project yet. My goal yesterday was to start the conversation, but to move on, I need somewhere I and others can at least white-board a bit.

I don’t even have a viable code name for the rewrite yet. (The best I’ve come up with so far is “Spengler”.) I’m open to suggestions – maybe WikiMo, maybe somewhere else.

The only thing I know we need and don’t have right now is a good “dockable XUL panels” implementation. Neil Deakin filed bug 554926 for this. This is not what I would call a “good first bug” by any means, but I suspect a lot of editor applications would love having this. (Komodo Edit, my Verbosio project, BlueGriffon, etc.) I envision using XUL panel elements to replace the floating windows Venkman currently provides. Panels in general could use a lot of help – see XUL:Panel_Improvements for details. I’m sure Neil would welcome patches.

Next steps

I don’t know yet. It’s too soon for me to call anything like a “Town Hall” for Venkman replacement efforts. I’m still trying to identify people willing to actively contribute time and talent. If it were me and Gijs alone, forget about it arriving in the next three years. We need help if it’s going to get done.

R.I.P., Venkman

Almost four years ago, I tried to rally an effort to keep Venkman, ye olde JavaScript debugger, alive. It has been on life support since then. The only people currently working on the code are Gijs Kruitbosch and myself, to my knowledge. For all intents and purposes, I believe the two of us are the current “owners” of Venkman, by default.

Soon, the new “JSD2” JavaScript Debugger interface code will land on mozilla-central. (By “soon”, I mean probably by the end of the year.) When Firebug moves to JSD2, JSD1 (which Venkman relies on) will be deprecated and eventually removed. This would be the final nail in the coffin for Venkman, and it’ll be time to bury it.

Now, I need a Venkman-like tool regardless, and the UI as it was presented to the end-user was fairly well defined.  The problems I had were really about how to make improvements on an architecture that’s over ten years old and has been abandonware for years.  When I need something and no one else is building it, I’m likely to build it.  So I’d like to start a new project that looks like Venkman, but works with JSD2 and has a clean, truly XUL-based UI implementation.

The biggest problem we face, by far, is a lack of available developer time. I have a full-time job and I’m about to start college (more on that in a separate blog post). Not to mention a little pet project that I’m obsessed about. To pull this off, we’re going to need some help, particularly from competent JavaScript authors. Previous experience in Mozilla UI hacking not required – I’ll be very happy to teach XUL & XBL to anyone who would offer significant help.

I’m looking for volunteers to help me kick-start a new Mozilla Debugger Developer Community. Who out there is interested?

(P.S. The company I work for is looking to hire developers who are familiar with FF extensions. For anyone who’s not experienced enough, a project like this is a great way to get into the field… )

Make: It isn’t just for files

This weekend, I took a special break and headed over to the Maker Faire at the San Mateo County Event Center. In some ways, it was both entertaining and shocking (in a good way).

Highlights for me:

  • Meeting David Brin and getting his autograph on “Foundation’s Triumph”
  • Mike Rowe and Adam Savage taking questions (that they’ve probably answered a couple hundred times already)
  • Another musical artist, with the “Slaperoo” (warning: site plays sound right away on load)
  • The incredible amount and quality of stuff being built by hobbyists these days
  • A little story about who exactly made the Apollo space suits (it’s not who you might think!). (The site is down right now.)
  • Eepybird. Nuff said, except that they’ve switched to Coke Zero.
  • The large variety of very cool tee-shirts. I’ll be ordering another one very soon, based on a card I picked up earlier today.

About the only thing missing is an appearance by the Blue Man Group, in my opinion. They would’ve found some neat instruments to add to their collection. Oh, yeah, they’re coming to San Francisco, starting Tuesday. 🙂 I’d love to have some fellow Mozillians join me for that one. Anyone interested, we’ll do a get-together and buy some group tickets.

Alex Vincent's ramblings about Mozilla technology, authoring, and whatever he feels like.