Category Archives: Uncategorized

Associate’s Degree in Computer Science (Emphasis in Mathematics)

Hi, all.  I know I’ve been really quiet lately, because I’ve been really busy.  My full-time job is continuing along well, and I just completed an Associate of Arts degree, majoring in Computer Science with an emphasis in Mathematics, at Chabot College.

I have an online music course to take to complete my lower-division education requirements, and then I’ll be starting in the fall quarter at California State University, East Bay on a Bachelor of Science degree, also majoring in Computer Science.

No, I don’t have any witty pearls of wisdom to offer in speeches, so I will defer to the expert in commencement speeches, Baz Luhrmann:

My two cents on WebExtensions, XPCOM/XUL and other announcements

(tl;dr:  There’s a lot going on, and I have some sage, if painful, advice for those who think Mozilla is just ruining your ability to do what you do.  But this advice is worth exactly what you pay to read it.  If you don’t care about a deeper discussion, just move to the next article.)

 

The last few weeks on Planet Mozilla have had some interesting moments:  great, good, bad, and ugly.  Honestly, all the recent traffic has impacts on me professionally, both present and future, so I’m going to respond very cautiously here.  Please forgive the piling on – and understand that I’m not entirely opposed to the most controversial piece.

  • WebAssembly.  It just so happens I’m taking an assembly language course now at Chabot College.  So I want to hear more about this.  I don’t think anyone’s going to complain much about faster JavaScript execution… until someone finds a way to break out of the .wasm sandboxing, of course.  I really want to be a part of that.
  • ECMAScript 6th Edition versus the current Web:  I’m looking forward to Christian Heilmann’s revised thoughts on the subject.  On my pet projects, I find the new features of ECMAScript 6 gloriously fun to use, and I hate working with JS that doesn’t fully support it.  (CoffeeScript, are you listening?)
  • WebDriver:  Professionally I have a very high interest in this.  I think three of the companies I’ve worked for, including FileThis (my current employer), could benefit from participating in the development of the WebDriver spec.  I need to get involved in this.
  • Electrolysis:  I think in general it’s a good thing.  Right now when one webpage misbehaves, it can affect the whole Firefox instance that’s running.
  • Scripts as modules:  I love .jsm’s, and I see in relevant bugs that some consensus on ECMAScript 6-based modules is starting to really come together.  Long overdue, but there’s definitely traction, and it’s worth watching.
  • Pocket in Firefox:  I haven’t used it, and I’m not interested.  As for it being a “surprise”:  I’ll come back to that in a moment.
  • Rust and Servo:  Congratulations on Rust reaching 1.0 – that’s a pretty big milestone.  I haven’t had enough time to take a deep look at it.  Ditto Servo.  It must be nice having a team dedicated to researching and developing new ideas like this, without a specific business goal.  I’m envious.  🙂
  • Developer Tools:  My apologies for nagging too much about one particular bug that really hurts us at FileThis, but I do understand there’s a lot of other important work to be done.  If I understood how the devtools protocols worked, I could try to fix the bug myself.  I wish I could have a live video chat with the right people there, or some reference OGG videos, to help out… but videos would quickly become obsolete documentation.
  • WebExtensions, XPCOM and XUL:  Uh oh.

First of all, I’m more focused on running custom XUL apps via firefox -app than I am on extensions to baseline Firefox.  I read the announcement about this very, very carefully.  I note that there was no mention of XUL applications being affected, only XUL-based add-ons.  The headline said “Deprecation of XUL, XPCOM…” but the text makes it clear that this applies mostly to add-ons.  So for the moment, I can live with it.

Mozilla’s staff has been sending mixed messages, though.  On the one hand, we’re finally getting a Firefox-based SDK into regular production. (Sorry, guys, I really wish I could have driven that to completion.)  On the other, XUL development itself is considered dead – no new features will be added to the language, as I found to my dismay when a XUL tree bug I’d been interested in was WONTFIX’ed.  Ditto XBL, and possibly XPCOM itself.  In other words, what I’ve specialized in for the last dozen years is becoming obsolete knowledge.

I mean, I get it:  the Web has to evolve, and so do the user-agents (note I deliberately didn’t say “browsers”) that deliver it to human beings.  It’s a brutal Darwinian process of not just technologies, but ideas:  what works, spreads – and what’s hard for average people (or developers) to work with, dies off.

But here’s the thing:  Mozilla, Google, Microsoft, and Opera all have huge customer bases to serve with their browser products, and their customer bases aren’t necessarily the same as yours or mine (other developers, other businesses).  In one sense we should be grateful that all these ideas are being tried out.  In another, it’s really hard for third parties like FileThis or TenFourFox or NoScript or Disruptive Innovations, who have far fewer resources and different business goals, to keep up with that brutally fast Darwinian pace these major companies have set for themselves.  (They say it’s for their customers, and they’re probably right, but we’re coughing on the dust trails they kick up.)  Switching to an “extended support release” branch only gives you a longer stability cycle… for a while, anyway, and then you’re back in catch-up mode.

A browser for the World Wide Web is a complex beast to build and maintain, and growing more so every year.  That’s because in the mad scramble to provide better services for Web end-users, the vendors add new technologies and new ideas rapidly, but they also retire “undesirable” technologies.  Maybe not so rapidly – I do feel sympathy for those who complain about CSS prefixes being abused in the wild, for example – but the core products of these browser providers do eventually move on from what, in their collective opinions, just isn’t worth supporting anymore.

So what do you do if you’re building a third-party product that relies on Mozilla Firefox supporting something that’s fallen out of favor?

Well, obviously, the first thing you do is complain on your weblog that gets syndicated to Planet Mozilla.  That’s what I’m doing, isn’t it?  🙂

Ultimately, though, you have to own the code.  I’m going to speak very carefully here.

In economic terms, we web developers deal with an oligopoly of web browser vendors:  a very small but dominant set of players in the web browsing “market”.  They spend vast resources building, maintaining and supporting their products and largely give them away for free.  In theory the barriers to entry are small, especially for WebKit-based browsers and Gecko:  download the source, customize it, build and deploy.

In practice… maintenance of these products is extremely difficult.  If there’s a bug in NSS or the browser devtools, I’m not the best person to fix it.  But I’m the Mozilla expert where I work, and usually have been.

I think it isn’t a stretch to say that web browsers, because of the sheer number of features needed to satisfy the average end-user, rapidly approach the complexity of a full-blown operating system.  That’s right:  Firefox is your operating system for accessing the Web.  Or Chrome is.  Or Opera, or Safari.  It’s not just HTML, CSS and JavaScript anymore:  it’s audio, video, security, debuggers, automatic updates, add-ons that are mini-programs in their own right, canvases, multithreading, just-in-time compilation, support for mobile devices, animations, et cetera.  Plus the standards, which are also evolving at high frequencies.

My point in all this is as I said above:  we third party developers have to own the code, even code bases far too large for us to properly own anymore.  What do I mean by ownership?  Some would say, “deal with it as best you can”.  Some would say, “Oh yeah? Fork you!”  Someone truly crazy (me) would say, “consider what it would take to build your own.”

I mean that.  Really.  I don’t mean “build your own.”  I mean, “consider what you would require to do this independently of the big browser vendors.”

If that thought – building something that fits your needs and is complex enough to satisfy your audience of web end-users, who are accustomed to what Mozilla Firefox or Google Chrome or Microsoft Edge, etc., provide them already, complete with back-end support infrastructure to make it seamlessly work 99.999% of the time – scares you, then congratulations:  you’re aware of your limited lifespan and time available to spend on such a project.

For what it’s worth, I am considering such an idea.  For the future, when it comes time to build my own company around my own ideas.  That idea scares the heck out of me.  But I’m still thinking about it.

Just like reading this article, when it comes to building your products, you get what you pay for.  Or more accurately, you only own what you’re paying for.  The rest of it… that’s a side effect of the business or industry you’re working in, and you’re not in control of these external factors you subconsciously rely on.

Bottom line:  browser vendors are out to serve their customer bases, which are tens of millions, if not hundreds of millions of people in size.  How much of the code, of the product, that you are complaining about do you truly own?  How much of it do you understand and can support on your own?  The chances are, you’re relying on benevolent dictators in this oligopoly of web browsers.

It’s not a bad thing, except when their interests don’t align with yours as a developer.  Then it’s merely an inconvenience… for you.  How much of an inconvenience?  Only you can determine that.

Then you can write a long diatribe for Planet Mozilla about how much this hurts you.

I bought a condominium!

It seems customary on Planet Mozilla to announce major positive events in life.  Well, I’ve just had one.  Not quite as major as “I’m a new dad”, but it’s up there.  With the help of my former employers who paid my salaries (and one successful startup, Skyfire), I closed a deal on a condominium on March 5, in Hayward, California, U.S.A.

There will be a housewarming party.  Current and former Mozillians are certainly welcome to drop by.  The date is TBA, and parking will be extremely limited, so an RSVP will be required.

I’ll write a new post with the details when I have them.

Think about what you do and to whom you do it.

Think, for a moment, about the kind of man you throw out of office.   We’re talking about a revolutionary here.  A man who changed the way we live and work entirely.  A man who did everything he could to promote independence, writing missives that people listened to.

Yes, he had his faults.  What man doesn’t?  But he was a leader before he was the top dog, and he did very well as a leader.  He has laid his thumbprint on history with his works.  He put his heart and soul, and his reputation, on the line, day in and day out.

You might think I’m writing the above about Brendan Eich, and I am.  But consider this:  the same could be said for the third President of the United States of America.

Congratulations.  By the same rational thinking, we the people just threw Thomas Jefferson out of office because he once owned slaves.

Promises: So We Rewire it!

We’ve been doing asynchronous code all wrong

The more I learn, the more I realize that old ways of doing things just aren’t capable enough.  For instance, JavaScript developers are taught to write asynchronous code using callback functions:

function asyncDriver(callback) {
    // Each asynchronous step nests inside the previous one's callback.
    controller.goDoSomething(var1, var2, function whenDone(result1) {
        controller.doSomethingElse(var3, function whenThatIsDone(result2) {
            controller.etc(var4, callback);
        });
    });
}

This code is ugly, and hard to think about.  Wouldn’t it be nice to write:

function asyncDriver(callback) {
    var p = Task.spawn(function() {
        yield controller.goDoSomething(var1, var2);
        yield controller.doSomethingElse(var3);
        let result = yield controller.etc(var4);
        // Legacy generators can't return a value, so Task.jsm
        // carries one out via a thrown Task.Result.
        throw new Task.Result(result);
    });
    p.then(callback, reportError);
}

Why, yes, it would!  This is what Promises and Task.jsm bring to us.  It makes really messy code easy to read again.  The yield statements here force the function to pause in the middle of its operation, under the assumption that each value yielded is a Promise object.  When the promise “resolves”, the function continues with the promise’s resolved value.  A sequence of nested asynchronous functions becomes one function which looks synchronous (but is really still asynchronous).
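For the curious, the pause-and-resume trick can be sketched with plain promises and an ES6 generator.  This is an illustration of the mechanism, not Task.jsm’s actual implementation – the names here (spawn, step) are mine:

```javascript
// A minimal Task.spawn-style runner: drive a generator forward,
// resuming it each time a yielded promise settles.
function spawn(generatorFn) {
    return new Promise((resolve, reject) => {
        const gen = generatorFn();
        function step(method, value) {
            let next;
            try {
                next = gen[method](value);   // resume the generator
            } catch (err) {
                reject(err);                 // generator threw: fail the task
                return;
            }
            if (next.done) {
                resolve(next.value);         // generator returned: finish
                return;
            }
            // Assume each yielded value is (or can be wrapped as) a promise.
            Promise.resolve(next.value).then(
                result => step("next", result),
                err => step("throw", err)
            );
        }
        step("next", undefined);
    });
}

// Usage: each yield pauses until its promise resolves.
spawn(function* () {
    const a = yield Promise.resolve(2);
    const b = yield Promise.resolve(3);
    return a + b;
}).then(sum => console.log(sum)); // prints 5
```

Rejections are forwarded back into the generator with gen.throw(), which is what lets ordinary try/catch work around a yield.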

This was what I used to write a JSONStore module for addons to use as a replacement for preferences – a way to store data and settings, and get them back when you need them.  There are some bugs, and this is just one of many ideas being floated for addon settings.  Discussion on the new module is ongoing in bug 952304 – it’s not part of Mozilla yet and won’t be for some time.  Use at your own risk.

But that’s not the whole point of this post.  I’m going a step further.

Promises, meet transactions

Promises are great when dealing with asynchronous operations… but if they fail and you want to roll back what has already happened, what do you do?  Being able to undo an operation is pretty important in a few environments, especially editing multiple files at once.

Now I like Mozilla’s native transaction manager API.  It works well for what it was designed for.  But it wasn’t designed for asynchronous operations.  Nor can I use the transaction manager in a chrome worker thread, because I can’t access XPCOM from chrome workers.  (There’s good reason for that, but it’s really unfortunate in this case.)

The transaction manager API has a couple other flaws:

  • Transaction objects don’t have any method for getting a human-readable description of what actually happened in the transaction.
  • If a transaction listener vetoes something, there’s no way for other transaction listeners which had already approved an operation to find out it was vetoed.
  • The transaction manager has limited ways of indicating its current state when things go wrong, not just in performing an operation, but in rolling the operation back.
  • If you’re dealing with multiple kinds of editing (DOM operations, source code changes), there’s really no good way to coordinate those.

As Tim Allen would say, “No power.  So I rewired it!”

I don’t have answers for these issues yet.  I think I’ll have to implement my own transaction manager API to improve upon all of the above.  But a design like that isn’t necessarily easy… I’d love to have help, especially if you think you might need something like this yourself.
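To make the shape of the problem concrete, here is a minimal sketch of a promise-friendly transaction with rollback.  This is not the Mozilla transaction manager API, and every name in it (runTransaction, the do/undo step shape) is hypothetical:

```javascript
// Run a list of asynchronous steps as one transaction.  Each step supplies
// a do() and an undo(), both returning promises.  On failure, every step
// that already succeeded is undone in reverse order.
async function runTransaction(steps) {
    const undoStack = [];
    try {
        for (const step of steps) {
            await step.do();
            undoStack.push(step);   // remember, in order, what succeeded
        }
        return { committed: true };
    } catch (err) {
        // Roll back the already-applied steps, newest first.
        for (const step of undoStack.reverse()) {
            await step.undo();
        }
        return { committed: false, error: err };
    }
}
```

Even this toy version shows where the hard parts hide: what if an undo() itself fails halfway through the rollback?  That’s exactly the “limited ways of indicating its current state” problem from the list above.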

One thing’s for sure:  when this new API and implementation is ready, it’ll have a lot of tests to go with it.  So, lend me your thoughts in the comments, please!

Enjoy the silence? Not so much.

For a long time, I’ve been wondering why no one besides spammers was responding to my blog.  After all, comments are one of the main features of a blog, and without them, a blog is just a bully pulpit.  Finally someone got through to me and let me know that comments were entirely broken on my blog.

*sound of a headdesk echoing across the Internet*

Wow.  Just, wow.  I had no idea.  I’m very sorry about that… and I can say that they’re fixed now.  (Thanks a lot, old silently busted WordPress theme!)

According to my admin panel, I haven’t had any comments on this blog in nearly two years.  So, if you’re curious about XUL editing or my Verbosio XML editor project, or about anything I usually ramble about, please take a few minutes to read over my past couple years of posts and drop a line.

Should I switch to GitHub? Should I mirror?

I’m in a dilemma.  Enough Mozilla community members have asked me, “why don’t you switch to GitHub?  There’s a larger JS community over there.”  I can’t exactly ignore that, considering that I really do need help.  And yet… I like SourceForge.  But not enough to be alone in the desert.

What do you think, Mozillians?  I know you’re busy, but how strongly do you prefer GitHub or SourceForge over the other?

UPDATE:  I just discovered comments were broken… fixed.

Verbosio progress, April 4, 2013: Return of the screenshots

With college and a full-time job, I’ve been rather busy.  But I still find time to make small improvements.  I’m continuing to work on my experimental XML editor, Verbosio, from the ground level.  I realized a couple months ago that I really needed some pictures to show where I was at.

This is work from my Verbosio Templates subproject, which requires Mozilla Firefox 22 (currently in Aurora).

The first image looks like a very cheap copy of DOM Inspector – and it is.

An XML document generated from DOMParser

The second image is little better, until you realize that the DOM implementation behind it is written entirely in JavaScript.

A DOM-like tree constructed from Simple API for XML (SAX) and pure JavaScript

There are a couple more differences – for instance, this latter view includes an XML declaration.  In the DOM specification, it shouldn’t be there, because the XML declaration is not recognized as a node at all.  But if you’re going to edit a document that already exists and already has that declaration, as a set of DOM nodes, you should preserve it.  So my non-standard DOM implementation makes it a processing instruction.
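As a sketch of that idea – the function and node shape below are hypothetical, not Verbosio’s actual code – the declaration can be split off the serialized text and kept as a processing-instruction-like node:

```javascript
// Capture the XML declaration (which the standard DOM exposes no node for)
// and represent it as a PI-like node with target "xml".
function extractXMLDeclaration(source) {
    const match = /^<\?xml\s+([^?]*)\?>/.exec(source);
    if (!match) {
        return { declaration: null, rest: source };
    }
    return {
        // nodeType 7 is PROCESSING_INSTRUCTION_NODE in the DOM spec.
        declaration: { nodeType: 7, target: "xml", data: match[1].trim() },
        rest: source.slice(match[0].length)
    };
}
```

Serializing the document back out is then just a matter of writing that pseudo-node first, so a round-trip edit preserves the original declaration.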

Both trees, by the way, are also using my TreeViews.jsm module.  I hope this module can be helpful in building XUL trees from object models like the DOM.

I’m also starting work on a WebGL-based view of the same trees, where nodes and their descendants are laid out in two dimensions as cubes, in a tree hierarchy like this.  WebGL is a three-dimensional context, though, so you may be wondering what the third dimension is for.  Initially, I’ll draw attributes as cubes above the elements they belong to.  That part shouldn’t be hard – and it should be very doable by anyone proficient with WebGL, inside an ordinary web page.  I’d love it if someone beat me to it.

I have other things to show through that third dimension, so stay tuned.  I’ll get back to preserving source formatting soon enough.

DimensionalMap: Accelerating the heat-death of the Universe

I thought I was being clever with my DimensionalMap subproject:  bootstrapping on top of the Map object to build a multi-dimensional hashtable.  Then I started using it heavily in my Verbosio Templates subproject to implement a very customized Document Object Model in JavaScript.  With over 900 Jasmine tests passing, execution time was averaging about 4 seconds.

However, I suspected my use of DimensionalMap was unnecessary – particularly since I was working with only one dimension at the time.  So in a quick experiment, I replaced the code that used it with code using WeakMap.  Total execution time with the same tests:  about 1.5 seconds.

Ouch.  I still think the API is a good one, but clearly the implementation has issues.
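The replacement itself is nothing exotic – for the single-dimension case, a plain WeakMap keyed on the node objects does the job, and its entries are garbage-collected along with their keys.  The names below are illustrative, not DimensionalMap’s API:

```javascript
// Per-node annotations without a multi-dimensional hashtable:
// one object key per entry means a WeakMap is all that's needed.
const nodeData = new WeakMap();

const node = {};                  // stands in for a DOM node
nodeData.set(node, { visited: true });

console.log(nodeData.get(node).visited); // prints true
console.log(nodeData.has({}));           // prints false: a distinct key object
```

If a second dimension ever becomes necessary, nesting one WeakMap inside another recovers most of the behavior without paying the abstraction cost up front.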

Hey, kids!  Wanna build your JavaScript profiling skills?  Help me find the spots that are causing unnecessary slowness!  I’ll be happy to write a letter of appreciation for anyone who wants to take it on.  As always, the code is available under Mozilla Public License 2.0, GPL 3 and LGPL 3.  (Childhood not required for participation.)