

Full Archives

Colour Bland: Value

September 29

This is the first in a series. View next.

Today was a sick day thanks to some bad chicken wings last night, so, sticking with the theme, let's bring up the subject of colour. Yes, I said colour with a 'u': that's the Canadian (and, I'm told, British) spelling of the word.

Sparked by a conversation with the ever-understated and humble Joe Clark (in the flesh, no less, due to the recent ATypI conference taking place not 15 blocks from my doorstep), and continued by a thread over at Jon Hicks' place, what has been coming up frequently is talk about colour-blindness. Specifically, that working with colour requires you to consider the percentage of the population who have trouble viewing said colour.

Firstly though, run, do not walk, over to Pixy's Colour Scheme Picker application. Bookmark it, use it, and love it in every which way possible. This is the same Pixy that brought us the wildly ingenious CSS image hovers with preload... The Czech web design scene is flourishing, so say my site's referrers, and I can't wait to see what else Pixy brings to light for us Anglophones. (thanks to Nick for bringing this to my attention on the Digital Web What's New board)

Now that you're armed, here's what you need to know. Colour deficiencies are varied and specific to the viewer, so although Pixy and others have come up with mechanical ways to simulate the deficiencies, these can only be used as approximations, not a final word. Additionally, converting a design to monochrome isn't a reliable indicator of what someone with a colour deficiency will see.

But there are certain truths in working with colour that can be exploited for your benefit. These are what we're interested in.

For maximum legibility, black text on a white background is the holy grail of on-screen design. This is not to say that every single instance of every single design entails using solely black on white. In fact, to some people (especially those on LCD monitors, which have amazingly high contrast ratios to begin with) this can prove too much contrast. But the basic principle is a starting point: dark text on light background is more legible than light text on dark background, and contrast between the text and what it sits on determines how easy it is to read.

Now consider colour. If you use coloured text on a white background, you are moving away from the ideal. This in itself isn't a problem, but ignoring deficiencies for a second, a wide range of tonal value exists within the visible spectrum. Value is a hard concept to grasp for non-designers, so first the technical description, followed by an illustration. Art Fundamentals: Theory and Practice (publisher McGraw-Hill, 1998) defines it this way:

The property of color known as value distinguishes between the lightness and darkness of colors, or the quantity of light a color reflects.

Simply put, some colours appear darker than others. Purple is a dark colour, red is middle of the road, and yellow is light. Figures 1.1 and 1.3 attempt to demonstrate the difference in value between the colours shown:

Figures 1.1 through 1.3 showing supporting colour swatches

Using this knowledge, it's not a stretch to see how using yellow text on a white background is ridiculously low in contrast, but using purple text on white could work.
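To hang rough numbers on that claim (this is my own back-of-the-envelope illustration, not part of the figures), here's a quick PHP sketch using the common 0.299/0.587/0.114 luma weights: the closer a colour's value is to white's 255, the worse it fares as text on a white background.

    <?php
    // Approximate a colour's "value" with the common luma weighting
    // (0 = black, 255 = white). A rough sketch, not a colour-blindness test.
    function value_of($hex) {
        $r = hexdec(substr($hex, 1, 2));
        $g = hexdec(substr($hex, 3, 2));
        $b = hexdec(substr($hex, 5, 2));
        return 0.299 * $r + 0.587 * $g + 0.114 * $b;
    }

    printf("yellow: %.0f\n", value_of('#ffff00')); // about 226: nearly white, hopeless on white
    printf("red:    %.0f\n", value_of('#ff0000')); // about 76: middle of the road
    printf("purple: %.0f\n", value_of('#660099')); // about 48: dark, plenty of contrast on white
    ?>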

What about the person who has trouble with purple though? There are many schools of thinking on this, and no definitive answers right now. Many who have trouble viewing purple and blue see the two colours as the same; the value of the colour may persist, despite confusion over the precise hue.

But we can see that converting our work to greyscale isn't an accurate test. Consider Figure 1.2—the same colours shown in Figure 1.1 have been mechanically converted to monochrome, and come out an identical shade of grey. Photoshop sees the colours as being equal in value, but our eyes do not. Photoshop is value-blind, so we can't accurately test work this way.

Note: Douglas Bowman has pointed out that other methods of converting to greyscale produce more accurate greyscale values. I'm aware of at least three different conversion methods, all of them producing different results. Keep in mind that none of these is meant to simulate colour-blindness either; they're merely software algorithms for representing value. No algorithm can simulate the human eye; they can only ever be approximations.
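To make the "different algorithms, different results" point concrete, here's a small sketch of three common conversion formulas. These are my own generic versions, not Photoshop's (or anyone else's) exact recipes; feed them the same colour and you get three different greys, none of which says anything about colour-blindness.

    <?php
    // Three generic colour-to-grey conversions; they routinely disagree.
    function to_grey($r, $g, $b, $method) {
        switch ($method) {
            case 'average':    return ($r + $g + $b) / 3;
            case 'lightness':  return (max($r, $g, $b) + min($r, $g, $b)) / 2;
            case 'luminosity': return 0.299 * $r + 0.587 * $g + 0.114 * $b;
        }
    }

    // Pure red, three answers: average = 85, lightness = 128 (rounded), luminosity = 76.
    foreach (array('average', 'lightness', 'luminosity') as $method) {
        printf("%-10s %.0f\n", $method, to_grey(255, 0, 0, $method));
    }
    ?>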

Continued with Colour Bland: Contrast.


Permalink › | 15 comments

Notes From All Over

September 26

On CSS redesigns, ISSN, semantics, and FIR revisited.

A note from Paul Hammond to those who would rebuild existing sites in CSS: "CSS can do so much more than tables ever could, and your work is showing none of this. In fact, it's hinting at the opposite, that CSS doesn't give us any new toys to play with and people can probably get away without learning it, as long as nobody looks at the source." Read the rest. Thought-provoking, and touches on a very good point: consider your etiquette. No one likes to be told they're wrong. §

Requisite link time: Joe Clark dismisses the increasing rejection of weblog owners who apply for ISSNs. You may remember an article on mezzoblue a few months back relating my own problems in obtaining one for this site. First-hand experience on this matter suggests Joe is entirely correct in both his assertions and conclusions. (via Zeldman) §

Marking up a title. This is a simple book title, folks, it should be open-and-shut. That it's not (and don't get me wrong: it isn't) suggests a big hole in general understanding of what is so semantic about semantic XHTML. I'd argue that true semantics require rigid definitions, leaving little room for grey area. But then I realize I don't know the first thing about the Semantic Web, and browsing through the source is far from enlightening: RDF is covered every which way, but I haven't seen a single mention of XHTML. Do we have an 'Elements of Style' for semantic XHTML yet? Will one exist? Does XHTML even matter in the context of a semantic web, or is RDF the ticket? These are questions. I don't know the answers. §

In bringing back online some of the still-missing content last night, I added this roundup of alternative FIR methods that came to light around the time I published my Digital Web article. I never publicized it, because it seems there really isn't a perfect alternative yet. Phark is nice code-wise, but doesn't work in IE5 and doesn't solve the images-off, CSS-on problem. Levin/Gilder is code-heavy and rather ugly, but does solve the accessibility issues... though it introduces a new problem with transparent GIFs. We haven't cracked this one yet, but there has to be a perfect solution out there somewhere... §

Permalink › | 14 comments

Eolas: Black or White?

September 25

Questioning Mike Doyle's motives.

Originally commented on Todd's thoughtful piece on what Mike Doyle of Eolas should do with his recent windfall from Microsoft:

With this patent, and the resulting half billion, Doyle has a hell of a power over today's web. He can make or break every browser on the market. There's no indication yet whether he's a good guy or not. He's given the web community no sign of good faith. And until that day comes, his ability to destroy Microsoft's competition remains an overwhelming shadow. I'd absolutely love it if he turned out to be a friend. If this is David and Goliath, he needs to throw the first stone. Doyle, it's your move. What'll it be?

I needn't duplicate what has been said elsewhere, but this patent issue is huge, and could potentially change the landscape of the web. We're quick to assume the worst, but could Doyle be the biggest friend Mozilla, Opera, and Apple have right now?

The developers and the competing entities out there to Microsoft, I think, will come to find that we are more of a friend than a threat. (Mike Doyle, Eolas)

Could there be a glimmer of hope with quotes like this? I want to believe.

Permalink › | 19 comments

The Real World

September 24

Looking for some well crafted, beautifully designed CSS-based sites for inspiration? Look no further than the portfolio of local Vancouver boys twothirty media.


You'll note, as you start poking through the source code, that most of these contain the odd table or two. You'll also notice that most come close to validation, but don't quite succeed.

These are real-world examples. Waxing theoretical about the benefits of pure and valid markup is fine, but when crunch time comes this is almost always the reality. Wired and ESPN do not validate. Cingular was never perfect, and is quite a mess now.

What can be said for the tables? Transitional layouts are still necessary today, given project requirements. Some visual effects cannot work reliably between the major browsers. Others cannot be done without CSS3. It's fun to think of a day when all browsers everywhere will handle every layout exactly the same. It's also fun to think of a day when we'll have flying cars and full meals in convenient pill format.

As has been noted elsewhere, we the people who are doing this for a paycheque face reality, not theory. When push comes to shove, we make the choices that work. This is not always consistent with the "right" choice.

Commercial web design will continue to be about compromise. Nine times out of ten, the more effective use of the client's dollar goes toward building and refining content over validating every last tag. Not to say that the two are incompatible, but between spending $500 on purifying their source and creating additional ads for off-site promotion, guess which one most clients perceive as the better value for that money?

This is no excuse not to strive for the end goal of a compliant site. But it is a polite reminder to those who would find fault in others who, through factors they have no control over, can only go 95% of the way.

Theory is nice, in theory. But providing the best solution involves knowing that sometimes, it's okay to break the rules.

Permalink › | 21 comments

Some Notes

September 23

Switching your base technology, and some pitfalls to avoid.

When you search for a very specific piece of information you don't yet know, and your own damned site is the first that pops up in the results, you will groan.

When you check the link because you can't resist and actually find the very answer you're looking for on your own damned site, you will groan louder.

PHP is a snap, except when it isn't. See for example the recommended reading list every fourth refresh or so. What. Ever.

OS X use continues to be joyful. Spending a weekend on a Mac building a PHP-driven version of what was formerly an ASP-driven site on a PC makes one feel gleefully subversive.

.htaccess files are a must. If your host doesn't allow them, move to one that does. Do it, and don't look back.

Using an old Movable Type installation to generate an .htaccess file when completely re-mapping your URL archiving scheme is genius, and very necessary.

MySQL is probably great, and wonderful, and hoorah. Don't try to build a MySQL-driven site in a single day when you haven't yet learned a thing about it.

Don't ever try to move a Movable Type install without first exporting all your entries through the interface and saving your templates. Trust me: never.

Hosting your site on a static IP will save your life one day. Do this.

If you switch hosts, don't stop paying the old one until you are sure everything has moved over to the new one. Burning bridges is not advised.

Even if your broken links are only temporary, provide pages explaining why, or you will spend more time answering e-mail about the links you broke than you will spend actually fixing them.
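For what that explanatory page might look like, here's a minimal sketch, assuming a PHP host and a hypothetical redirects.php wired up as the 404 handler: known old URLs get forwarded to their new homes, and everything else gets a short apology instead of a bare error.

    <?php
    // redirects.php - hypothetical 404 handler sketch, not my actual setup.
    // Map known old URLs to their new homes; apologise for everything else.
    $map = array(
        '/archives/000244.asp' => '/archives/000244.php',
        '/css/index.asp'       => '/css/',
    );
    $requested = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
    if (isset($map[$requested])) {
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: ' . $map[$requested]);
        exit;
    }
    header('HTTP/1.1 404 Not Found');
    echo '<p>This page moved during a recent server switch. ';
    echo 'Try the <a href="/archives/">archives</a> while the links get fixed.</p>';
    ?>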

Most importantly: when you are extremely busy and your server goes AWOL and you just don't have the time to fix it properly, remember that it's not Armageddon and people will forgive you for a few days of not reading fresh content. They will. They're good people.

Permalink › | 13 comments

Still in Limbo

September 20

update: Just so everyone knows, what you're currently looking at is in fact the new server, with full archives. I pulled the DNS swap (I didn't find out about modifying local host files until too late) and managed to export my entries. I'm running Berkeley DB, just to clear that up—since I was re-installing anyway, I started fooling around with MySQL, but I'm just way too busy to wrap my mind around it right now. Maybe when things settle down.

Disappearing comments are thanks to the server switch. The last 10 to 15 comments are out of sync between the two, although it seems most are seeing the new server now. I'll lose a few in the process, but oh well.

There are still a few bits and pieces on this site that 404, but I'll be .htaccessing everything that I can, and fixing links where I can't. Links to previous posts should still work. More problem-solving once propagation hits my neck of the woods.


So here's the latest. Apparently you can't, in fact, drop an entire Movable Type install on a new server and expect things to just work out. In fact, it looks like even though I have my various databases backed up in full, multiple times, I have no way of actually getting that data back into Movable Type. I didn't try the proper internal 'export entries' method from MT before the DNS switch, and now I can't because the name server resolves here.

What to do? Well, for a while anyway I'll be jumping back to the other server. Once it propagates out my way, I'll try a proper export then hop back here.

Since DNS switches are anything but instantaneous, this will take another few days. I'm starting to feel like I'm running around in circles here.

On the bright side, most of the static content on this site has been converted to PHP, so (hopefully) the process of rebuilding the weblog data and re-pointing old archive links to the new equivalents will be quick. Once I have that data. If I can get it.

Any suggestions? I'm all ears.

Permalink › | 11 comments

Rollovers in IE

September 19

To: anyone finding that image-based CSS rollovers 'blink' in Internet Explorer, traditional or Pixy method:

Tools > Internet Options > General > Temporary Internet Files > Settings

Move the radio button from 'Every visit to the page' to 'Automatically'. Remember? You turned that on to clear up IE's caching problems. Most users haven't, so you and a handful of other developers are the only ones experiencing the blink. Rejoice.

From: someone who was in the same boat until a particularly frustrating evening of testing many moons ago.

Permalink › | 8 comments

Bumpy Ride

September 18

Folks, this is your captain speaking.

You may have noticed the turbulence. That's to be expected when we're flying this close to the ground, although we're doing our best up here in the cockpit to pick up the nose a smidge. The windshield's a little frosted over, but as best we can tell we're heading in the right direction. (If anyone knows what Cleveland looks like from the air, well golly we'd sure appreciate it if you could come up to the front and let us know.)

Flight attendants will be coming by shortly to pass out a light snack and some beverages. If you need a paper bag, they may have one or two left, but just, uh... try and hold it, yeah? We're running out of those in a real hurry.

We've shut off the inflight movie. Sorry about that, folks, we forgot to screen our new selections this month. Who knew that 'Alive' wouldn't have... uh... been as uplifting as one would have hoped. Gosh, you should see us. We sure are red right now.

Looks like we'll be clear in, oh, a few hours. Keep an eye on the ground, and let us know if we get too close. Thanks for flying with us this afternoon, and we sure hope you'll choose our airline for your next flight.


Sorry about all this, I'm still trying to get solid answers. Even my host's site has been up and down for the past two days, so I can't check the status of my claim.

It comes at a bad time, but my hand is forced: I'll be making the switch to PHP in the next few days, for better or for worse. Matt Mullenweg has stepped to the plate with an offer I just couldn't say no to, so a few days of madly patching my scripts to work in a non-Microsoft environment will ensue.

It'll sting for a bit. But we'll all be happier for it. Thanks for your continued patience.

Permalink › | 26 comments

Downtime

September 17

Back in the saddle. Archive of today's temporary page follows, since there are bound to be questions and bits of advice.

!$%&@* §

When the high-tech method fails, there are always low-tech alternatives. Thanks to an increasingly spotty host, I'm kicking it Zeldman-style today and hand-coding this in Notepad. It would seem that any files ending in .asp are timing out, and reporting strange things. So the easy alternative is an index.html in my root. Which is what you're viewing now. All this to say I'm still alive, I'm not going anywhere, and this will all be worked out shortly. Thanks for your patience. Keep checking the archives—when things are restored, you should be able to view the rest of the site. d.

Since this page is ephemeral anyway §

mezzoblue + Apple. The day has come. I finally took the plunge and picked up my first Mac this week. Ever. Here's a fun thing: prices are, as a rule, higher in Canada. In this case, Apple charges a 50% premium on all its products. So for the low, low price of $2000, I just barely managed to buy the lowest-end iBook they currently ship. Fun! To be fair, I sprung for an extra 512MB of RAM that's included in that price. So even this comparatively underpowered piece of sexy, sexy plastic now has more juice than my existing system. Have I "switched"? Not a chance. Will I? Time will tell. Am I loving OS X? Why yes I am, thanks for asking.

Places you can go §

  • VeriBadSign: Bowman on Verisign's Bad Move.
  • Does Microsoft want to lose the plug-in patent case? Zeldman on Microsoft vs. Eolas.
  • SimpleQuiz - Part VI - Form(atting): Cederholm's Super Semantic SimpleQuiz Successor.
  • MTSETUP Alpha Testing: Rubin automates Movable Type installs.
  • Things of Interest; the Job Hunt: hire Mike Pick.
  • CSS only mostly stupid: Marcotte fixes LIR. MacIE5 users rejoice.
  • Dive Into Publishing: Pilgrim's "Dive into Python" soon available on dead trees.

Permalink › | 10 comments

WaiZilla

September 16

Excellent! Tim Roberts has started hacking away at an open-source, cross-platform accessibility validator he's dubbed WaiZilla.

All testing is performed client-side, so results display in mere milliseconds instead of being delayed by a page refresh. Tim has set up waizilla.com for future downloads/documentation/etc.

He's looking for volunteers; anything you can contribute would be appreciated. More detail over at AccessifyForums. A worthwhile cause; good show, Tim!

[via Accessify]

Permalink › | 1 comments

Copy-let's-get-this-Right

September 13

Freedom wins.

Point: there are people who will steal Zen Garden designs no matter what we do.

Point: slapping a restrictive license on the CSS files is completely against what the Zen Garden is trying to accomplish.

Point: cheap knock-offs are not a professional threat to the original designers.

Point: those who need to use other people's CSS are still learning, or they're lazy. If it's the former, their use is an intermediate step; when they feel comfortable enough to move beyond someone else's design, there's no question they will. If it's the latter, the site they're applying the design to won't be around for long anyway.

So there you go. Creative Commons is back (although it never technically went away, since I hadn't got around to modifying the .css files yet), and all new submissions will continue to be open.

I realized the value of the Zen Garden is much more than I originally intended: not only do we have a strong demonstration of what CSS design is all about, but we also have a central repository of tested CSS designs that have solved browser compatibility issues and layout problems. Little Boxes, Blue Robot, and Glish are all still useful and relevant, but we're moving beyond duplicating table-based layouts.

Creative Commons remains a double-edged sword. Individual use has never really bothered me (there have been plenty of people basing sites/weblogs on the .css files), but the recent example moved into a whole new territory I wasn't prepared to deal with, namely re-distribution.

I worry all this will inhibit some submitters. Design is not software: there is no such thing as an 'open source' mentality. Some share more freely than others, but most are very firmly negative about their work being used elsewhere. The quality of work being submitted keeps going up despite it all; maybe I needn't worry.

Plunder! Dissect! Pilfer! Learn! (Just don't steal the images.)

Permalink › | 24 comments

Accessibility Notes

September 12

Valid code = accessible sites? Problems with 'skip navigation' links, with solutions. Window-Eyes training. Personal reflection.

Mike Pepper informed me this morning of a site he's been working on. Not only was this his first XHTML/CSS site ever, but upon running it through Bobby we discovered that, without even initially trying, he flew through Section 508 compliance and passed AAA with only one minor, corrected glitch. When you are questioned on the benefits of valid markup, relate this story. Obligatory yadda yadda yadda: accessibility doesn't stop with Bobby or Cynthia, they're merely to be used as guides. As always, follow WCAG guidelines for everything the software validators can't check. But you knew that, right? §

Bob Easton brought up a good point recently. For the same reasons that using display: none; makes classic FIR an accessibility headache, a few screenreaders won't pick up carefully crafted "Skip Navigation" links that are specifically designed for them. It's a bit of a pickle, but Jon Hicks to the rescue! Although four lines of CSS is a bit messier than one, the following code solves the problem quite nicely:

    .skipLink {
      height: 0;
      width: 0;
      overflow: hidden;
      position: absolute; /* for the benefit of IE5 Mac */
    }

That's not the end of it though. In his now-famous book, Joe Clark argues for making the "Skip Navigation" link viewable to all users anyway. Look, no one ever said this web thing was going to be easy. §

Speaking of Joe Clark, he forwarded an interesting note from a WAI mailing list earlier this week that those around the MA area might be interested in:

GW Micro is pleased to announce that Amy Ruell of the Visually Impaired Blind Users Group and the Massachusetts Institute of Technology (MIT) are hosting Window-Eyes Basic and Intermediate Skills Training. The trainers will be GW Micro staff. The training will take place October 9-10, 2003 at the MIT Adaptive Technology (ATIC) Lab, 77 Mass. Ave, Cambridge MA.

Window-Eyes is the second-ranked product on the screenreader market after JAWS, and this training session is for everyone. More information can be found on the wai-ig mailing list. §

Yesterday marked the second anniversary of an immense tragedy. If I needed to excuse myself for commemorating it (the page I used was courtesy of Matt Haughey), perhaps those complaining might think to re-examine their priorities. Thank you Jai in particular for helping re-affirm my belief in humanity. And after yesterday's silence, take another moment to mark the passing of the Johns, Ritter and Cash. §

Permalink › | 18 comments

Copywrong Revisited

September 10

When is an orange not an orange? When someone colors it purple. I have learned more about copyrights and licensing in the past 24 hours than I ever thought I'd need to.

This post is about the general issue. The specific dispute has been resolved, as the person I was quoting has both privately and publicly apologized and settled this in a way acceptable to me and hopefully all the Zen Garden authors. The funny thing about repentance is that it places the accuser in a spot of feeling guilty for doing the accusing in the first place, but I digress, and we've moved on.

It would seem that Creative Commons is a tangle of worms, and the simple three-step process they offer when selecting a license is overly simplistic. Back in May when I evaluated CC and chose the license I did, I made sure to read the full legal code and didn't find anything that I couldn't agree with. Experience since has highlighted the problems with CC licenses in real world conditions which I never could have foreseen at the time:

  • 'Derivative Work' is an undefined term. Technically, someone could grab any work governed by this license, modify a single pixel, and re-release it as their own 'derivative work', and that's okay.
  • 'Attribution' is an undefined term. The same person is not obligated to credit the original author visibly; they could just bury their attribution in an obscure spot of the source file that no one but the most conscientious developer will ever see.
  • The license is definitive. You cannot add terms on top of it. This effectively means you lose any control of your copyright beyond what Creative Commons affords. This is what I got tripped up on; I assumed it was okay to further refine some points, considering I/we still owned the copyright on the original work.
  • The license is non-revocable. If someone chooses to do something with your work that you don't approve of (think hate or porn sites here) you're out of luck. Worse: they have to attribute you. You can't even request that they use your work but not associate it with you. Stew on that for a while; it'll leave a bad taste in your mouth.

I wanted to keep things simple and open. To me, and to many of you, there are clear lines between fair use, respectable use, and outright theft. To others, there aren't, and this is why things will have to change.

I want people to be able to use these .css files. I want them to learn from them, I want them to take the techniques within and produce new work with them. That's how I and many others learned what we know, and I want to fully encourage it to continue.

But I don't want wholesale copying. Images or no, the designs are not templates. Zen Garden submitters are not spending their free time putting together work so that others may re-use and possibly profit from it. This, it would seem, is what some can't differentiate. There's no license in the world that says "use some, but not all," and even if there were, the definition of "some" is too vague to be legally binding.

So, because of the few, the many once again suffer. I now have to spend the next few weeks refining a more specific license for the Zen Garden. The spirit of openness and learning will be preserved as far as it can, but since it was obviously too open to begin with, it will be much more restrictive.

None of us is glad to see it come to this. I think Michael said it best in this comment:

The sad part in all of this is the legalistic vantage points that everyone has to come from nowadays.

It can't just be about what's proper, polite, courteous, considerate, correct, or nice.

It has to be about what the license says, what the offender feels entitled to, how far the offender feels he can go with it, what the letter of the law says can and can't happen.

It can't be a nice little "oh sorry", "hey, no problem" affair. It's gotta be "sorry bucko you lose, too bad" and "listen here you jerk". Maybe I'm just a little Pollyanna, but I expect better behavior out of people whether or not there's a license covering that behavior. It saddens me that there are people out there so immature and so self-absorbed that they just can't understand it when people don't want to give them the world for free and act belligerent when people challenge them for that behavior.

Permalink › | no comments

Copywrong

September 9

Perhaps I should wait until I cool down some... but by the time you're done reading this, chances are you'll be angry too.

Logic aside, if you offer work for others to learn from you run the risk of those same people you wish to help claiming the work as their own. I've had good luck so far with the Zen Garden, but that changed this week.

Since nearly the beginning, there have been people grabbing entire templates, graphics and all, and using them as the basis of their sites; in those cases I have politely asked the site owners to take them down, and compliance has been prompt and agreeable.

The tricky part is when people want to use an individual .css file as a basis for their design, changing only images, perhaps with a new colour or font scheme. Obviously releasing the .css file under the Attribution-ShareAlike license from Creative Commons means that it's open to interpretation; I've tried to address this in the FAQ by adding a further layer of permissions on top of the CC license.

The idea is that you can feel free to steal bits and pieces, learn from, and generally plunder the CSS as long as you don't approach it as a template. The line is fuzzy and vague, and it's where most confusion springs from. I've been lenient. Most have been considerate, and e-mailed me or the designer before beginning; generally these people are granted permission, and the designer is flattered.

But this week I have been dealing with someone in particular who started off on the wrong foot by using the designs wholesale (graphics and all), and continues to grow more belligerent with each new e-mail. Allow me to quote him, sic:

I am not your ennemy... There is no copyright issues in seeing Mona Lisa In the Louvres museum, draw it at home, and make this drawing available on the internet, (once again, there will be no commercial issues with theses templates)...Can we talk to find an acceptable solution? Waiting for a proposal of yours Dave. Forget the threats, or I will definitively skip you off the process. You have no choice Dave, either I do it under your supuervision, or I do it alone.

And:

If you don't want this like that to happen again, then I am affraid that you should make your web site private access, and make the people pay per view... do you like the idea? Ever heard about the price oF success? Congratulation Dave, that's what is happening to Css Zen Garden

Where does this sense of entitlement come from? How can someone see something online, assume they have permission to use it, and argue against the person who created it that they have the right to continue doing so? This particular individual was planning on using some of the Zen Garden designs as templates for re-distribution with an open source content management system. Perhaps that says something.

Regardless, I think it's time to re-visit the Zen Garden's licensing. I have a responsibility to protect the work of 34 graphic artists (and that number grows weekly.) I need better tools at my disposal.

Permalink › | 74 comments

September 8

Interesting news item from Typographi.ca: much as the BSA (Business Software Alliance) audits large companies in an effort to police its members' licensing terms, large type foundries like Agfa Monotype, Linotype and more are now beginning to crack down on illegal usage of their products.

Feedback on this news is, expectedly, largely negative. Nobody likes the heavy hand of enforcement, but fonts suffer the same false impression that software still does: they are easily copied, so they must be value-less.

Type foundries have a unique weapon on their side, as does any institution producing creative materials with the intent of re-use by the purchaser (e.g. stock photography and illustration providers)—the resulting work, by nature, is highly visible. In fact, it's unsuccessful if it's not. Lawsuits will no doubt abound from publicly-spotted work.

The question, then, is how does this apply to the limited set we take for granted daily? Web typography is restricted to the small subset of mostly cross-platform fonts that Microsoft bestowed upon us over half a decade ago: Arial, Courier, Georgia, Times New Roman, Trebuchet MS, and Verdana. (Of course, there are also Comic Sans MS, Impact, and Webdings if you're into that sort of thing)

It looks like the answer is that it doesn't. Their core web fonts package (which, incidentally, is no longer available from Microsoft on the grounds that 'all the fonts within are included in Microsoft products anyway') shipped with licensing terms that are refreshingly generous. The FAQ states: "Designers can specify the fonts within their Web pages." The EULA states: "You may install and use an unlimited number of copies of the SOFTWARE PRODUCT."

We can conclude that no restrictions governing use existed at the time of distribution. Whether this changes in subsequent releases is yet to be determined; it could happen. For now though, our core web fonts are still free for all to use.

Permalink › | 10 comments

Statistics

September 5

Wanted: a good, free site statistics utility.

I don't need much: hourly numbers for the current day, daily numbers for the current month. Past archives can just be a single number for the entire month, nothing more. Something simple, but accurate and reliable. It has to have a small footprint: a single GIF embedded in the page is preferred. Oh, I'll want referrers too. I like those.

HotStats died last week, and I miss my self-congratulatory vanity meter. I was coming close to a rather impressive milestone, too, damn the timing. Ah... lest you think I'm asking out of purely selfish reasons, the missing domain has also been hanging the Zen Garden. (I haven't gotten around to dropping the code from it yet)

I've long rejected SiteMeter, but I'm considering Re_Invigorate. Better options? And I'm running on IIS, so Dean Allen's Refer is out.
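For what it's worth, the single-GIF approach isn't much code. Here's a rough sketch of the general idea (my own illustration, not any particular service, and assuming a PHP host rather than the IIS/ASP setup this site currently runs on): the page embeds an image pointing at a script that logs the hit and the referrer, then returns a transparent 1x1 GIF.

    <?php
    // counter.php - hypothetical single-GIF hit counter sketch.
    // The page embeds <img src="/counter.php" alt="" /> and each request logs one line.
    $line = sprintf("%s\t%s\t%s\n",
        date('Y-m-d H:i:s'),
        isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-',
        isset($_SERVER['REMOTE_ADDR'])  ? $_SERVER['REMOTE_ADDR']  : '-');
    file_put_contents('hits.log', $line, FILE_APPEND);

    // Send back a transparent 1x1 GIF so nothing visible lands on the page.
    header('Content-Type: image/gif');
    echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
    ?>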

Permalink › | 28 comments

Review Day

September 4

Sitepoint Forums contest winners announced, under-the-hood changes recently made around here.

A Design Review

Sitepoint Forums ran a CSS design contest recently, inspired by the Zen Garden. The entries are in, and some are great. I don't know what the official contest policy is on the original designers re-using these, but Patrick H. Lauke's example design Gothic found its way into the Zen Garden as Gothica, so I'm inclined to think it'll be alright. That being said, Ray Henry with his fabulous reh3 and even more fabulous Deadend Prophecy had better get in touch with me after the results are in. That is, if, for some very very strange reason, he doesn't sweep the contest. You may remember Ray's existing Zen Garden design Backyard. Beautiful work, all three. §

A Technical Review

With any luck, you haven't noticed: I've been doing a lot of tweaking under the hood around here, and I think it's about time to mention a few things.

I'm now employing Pixy's absolutely brilliant CSS rollovers on the right. Where before there was a slight blink as the image loaded upon first mouseover, now there's a completely seamless transition.

5.0 browsers should handle this site a little better. IE5/Win was never much of a problem, but there are still a few tiny quirks yet. IE5/Mac was a complete disaster, and thanks to some help from Alicia Lane (and a gentle nudge to actually do something about it), I've sanitized the majority of the blast radius. There are still problems, but at the very least it's somewhat usable now.

Search is (finally) coming. I have to do some major work on the templates, but I've got the functionality started. Now to figure out where to put it.

Cynthia Says thinks my front page passes all the major checkpoints for AAA. Bobby doesn't. I know there are still outstanding issues regardless, and a good read-through of WCAG is going to help me fix them, but I'm moving in the right direction. It should be noted that the errors reported by Bobby are questionable:

  • "Do not use the same link phrase more than once when the links point to different URLs." As Ken Walker has recently discovered and Jukka Korpela analyzed in further detail, it should be possible to provide a title for each link and satisfy this requirement. That doesn't work.
  • "Separate adjacent links with more than whitespace." Cynthia says I'm doing this. Bobby says I'm not. The rejected markup is my headers, which look like so: <h3 id="p000244"><a href="http://etc/000244.asp" title="perma-link for [this article]">Plugins & <object> - Illegal?</a></h3> If I'm doing something wrong here, I'm all ears. But I don't think it's me.
  • "Make sure event handlers do not require use of a mouse." Well, this is just plain silly. There's not an iota of script on my front page.

Cynthia gets them right. However, since those of us who spend the time doing it right like to make mention of it, Cynthia's inability to directly link validation results for any given site continues to be a problem. There's a very important point that should be made in here about not trusting software anyway, since the guidelines need a human eye for complete validation, but I'm getting verbose as it is.

After publishing a piece on bulletproof XHTML, I obviously had to start thinking about the issue myself. You might be pleased to note that most pages on this site, including the comments pages, are validating. The current plan is to 'flip the switch' and trigger XML rendering mode in a few weeks, after I've had time to convert old content and make sure I've got my error-correcting working properly. There will be a follow-up piece on how I went about it at that time.

Up until now I was marking up all my <acronym>s by hand. Thanks to Brad Choate's plugins and Mark Pilgrim's macros, I don't need to anymore. Interesting note about the use of abbreviations on a page: WCAG suggests providing a title for an <acronym>/<abbr> when it first occurs in the document. It doesn't say you can't do the same for other occurrences, but it suggests you don't need to. No problem, that's the exact behaviour of the plugin. But it begs the question—even though I'm not applying titles to repetitions, do I need to mark up the rest of the occurrences with the proper tags? My gut feeling is that I do, but the plugin doesn't allow that. So I'm at an impasse here: more work for me, vs. proper semantics. Hmmm. And yes, I know the difference between <acronym> and <abbr>, and no, I won't use the latter until IE supports it. End of question.

If you've made it through all that, my final point is that this site still looks the same as it did last week. Ever get the feeling that sometimes you have to wear too many hats? §
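An aside on the acronym question above: the general shape of that kind of auto-markup is easy enough to sketch in PHP. This is only an illustration of the idea, not Brad Choate's plugin or Mark Pilgrim's macros, it assumes PHP 5.3 or later, and it cheerfully ignores the case where an acronym already sits inside a tag or attribute.

    <?php
    // Sketch: expand known acronyms; only the first occurrence gets a title,
    // but every occurrence still gets the <acronym> element.
    function mark_acronyms($html, $acronyms) {
        foreach ($acronyms as $abbr => $title) {
            $seen = false;
            $html = preg_replace_callback(
                '/\b' . preg_quote($abbr, '/') . '\b/',
                function ($match) use (&$seen, $title) {
                    $out = $seen
                        ? '<acronym>' . $match[0] . '</acronym>'
                        : '<acronym title="' . $title . '">' . $match[0] . '</acronym>';
                    $seen = true;
                    return $out;
                },
                $html
            );
        }
        return $html;
    }

    echo mark_acronyms('WCAG says one thing; WCAG suggests another.',
        array('WCAG' => 'Web Content Accessibility Guidelines'));
    ?>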

Permalink › | 23 comments

A Second Voice: Bulletproof XHTML

September 3

[Second Voice icon: Markup]

Self-professed markup geek Evan Goer is a senior technical writer for Chordiant Software. He realized that web standards were important the day that Netscape 6 became available internally while working at Sun Microsystems, and he's been recovering from the shock ever since.

In 'Bulletproof XHTML', Evan explores the full meaning and implications of XHTML. Using University of Texas physics professor Jacques Distler's recent experiences with XHTML as a case study, Evan points out the quirks of the language, and how true forward-compatibility is a difficult proposition. This is the third edition of A Second Voice.

update: Well it looks like Evan wasn't the only one thinking about this recently. WaSP has posted a new article to their "WaSP asks the W3C" series which expands on Evan's coverage of MIME types and browser support. Make sure not to miss Serving XHTML with the Right MIME Type.

Permalink › | no comments

Markup: Bulletproof XHTML

September 3

[Second Voice icon: Markup]

Evan Goer is a senior technical writer for Chordiant Software. He realized that web standards were important the day that Netscape 6 became available internally while working at Sun Microsystems, and he's been recovering from the shock ever since. Evan's main regret these days is that he has no sense of typography or graphic design whatsoever. Sooner or later he's going to take some classes, damnit.

The vast majority of today's "XHTML" websites are invalid. No, don't take my word for it — you can verify this for yourself by running the following experiment:

  1. Collect a random group of websites that declare an XHTML doctype.
  2. Run the home page of each site through the W3C validator.
  3. If the home page validates, validate at least three random secondary pages.
  4. Observe monstrously high failure rate. Lie down with cold compress on forehead. [Optional]

A naive observer might find these results surprising. After all, by definition all XML must be well-formed: this is what allows you to parse it with efficient off-the-shelf XML parsers, transform it with a standard transformation language, describe and manipulate its tree-like structure with a standard API, and much more. Malformed XML forfeits all of these benefits. So why would anyone bother churning out pages upon pages of the stuff?

Why, indeed. Consider the humble browser parser (the browser component that is responsible for reading your markup). Modern browsers contain at least two parsers: one for XML and one or more for HTML. The lax HTML parser does its best to display pages no matter how mangled they are, while the strict XML parser chokes on the smallest error. Unlike its cousin, the XML parser is hard to trigger — you have to do something "special", such as serving up an unusual MIME-type. Since invalid sites rarely bother to trigger the XML parser, their pages are parsed as HTML rather than XML. Thus we are protected from the vast wasteland of invalid XHTML out there, and the proprietors of these invalid sites are none the wiser.
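For the curious, "something special" mostly comes down to the Content-Type header. Here's a minimal sketch of one common way to handle it in PHP (my own illustration, not anything the article prescribes): send application/xhtml+xml only to browsers that say they accept it, and plain text/html to everyone else.

    <?php
    // Sketch: trigger the XML parser only in browsers that advertise support.
    $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
    if (strpos($accept, 'application/xhtml+xml') !== false) {
        header('Content-Type: application/xhtml+xml; charset=utf-8');
    } else {
        // IE and friends get the forgiving tag-soup HTML parser instead.
        header('Content-Type: text/html; charset=utf-8');
    }
    ?>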

But what's so wrong with this picture, really? Let's take a look at a real world example of what happens when XHTML is treated as XML rather than tag soup.

Canary in the Coal Mine

Jacques Distler had a problem: he wanted to share equations on string theory and quantum field theory with his fellow physicists on the web. Although Jacques had never needed to pay much attention to markup languages before, he did know how to write equations in a language called TeX. TeX wasn't particularly web-friendly, but Jacques happened to have a tool that would convert TeX to a newfangled XML standard called MathML. MathML looked ideal for displaying equations on the web. How hard could it be to post a few equations?

Jacques discovered that the Mozilla browser could display MathML... if the MathML equations were embedded in a well-formed XML document. Fortunately, the W3C had thoughtfully provided an XML formulation of HTML called "XHTML 1.1" that allowed the embedding of inline MathML equations directly in web pages. Mozilla could display the equations if the page was valid "XHTML 1.1 plus MathML 2.0" and if the page triggered Mozilla's internal XML parser. So Jacques dutifully constructed a valid template, configured his server to serve up his pages to Mozilla with the recommended MIME-type for XHTML, and voilà — the equations displayed beautifully!

However, Jacques soon discovered that he was living on a knife's edge. Because he was using Mozilla's unforgiving XML parser, one little mistake — a mismatched tag, an unescaped entity — would choke his visitor's browser. And to his consternation, Jacques found that even if he wrote perfectly well-formed XHTML, other people were conspiring to mess up his web pages. By allowing comments, opening up trackbacks, and displaying snippets from alien RSS feeds, Jacques had opened up his site for any random visitor to crash that page with garbage markup. In order to produce 100% valid XHTML, Jacques realized that he had to "bulletproof" his site. Strip control characters. Validate comments. Batten the hatches. If he was going to take advantage of the power of XHTML, he would have to protect his site from his own mistakes and everyone else's.
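What does that kind of bulletproofing look like in practice? A rough sketch, assuming a PHP host with the DOM extension (this is my own illustration, not Jacques' actual setup): strip control characters, then refuse any comment that isn't well-formed once wrapped in a dummy element.

    <?php
    // Sketch: scrub a visitor comment before it can break an XHTML page.
    function scrub_comment($comment) {
        // Drop ASCII control characters except tab, newline and carriage return.
        $clean = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F]/', '', $comment);
        // Wrap in a dummy element and see whether the XML parser accepts it;
        // a mismatched tag or a bare & fails here instead of in a visitor's browser.
        $dom = new DOMDocument();
        return @$dom->loadXML('<div>' . $clean . '</div>') ? $clean : false;
    }
    ?>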

The X in XHTML

At first glance, the conversion from HTML to XHTML seems straightforward enough. Just change a couple of lines at the top of the page, set everything to lowercase, quote your attributes, close your tags, fix a few nits here and there... and presto! Forward compatibility achieved. Right?

Well, not really. It's easy to do a one-off conversion to XHTML. The hard part is constructing a site so that it stays valid XML no matter what you or anyone else throws at it. Jacques had to go through this process, and if you're going to actually use XHTML for anything, you will too.

As for whether XHTML is indeed critical for forward compatibility, that's still an open question. Let's take this at face value. Consider the following statement:

The future of the Internet is XHTML.

If that statement is true, then an unpleasant truth follows: if an XHTML site isn't bulletproof, then the site isn't forward compatible. Again, the reason XHTML is "more advanced" and hence "more forward compatible" than HTML is that all-important "X". XML's inherent strictness is the key that enables new functionality in XHTML. Thus, if you violate this strictness and serve up tag-soup XHTML, you've accomplished nothing — you haven't enabled your site to use any XHTML-specific features, either now or in the future. (Conversely, if the statement is false, then serving up tag-soup XHTML isn't a disaster. It's merely embarrassing.)

Note that when we speak of "forward compatibility", we must be careful not to conflate "clean semantic markup", "CSS layouts", and "good accessibility" with XHTML itself. You can meet (or fail) all these goals whether you use XHTML or plain old HTML 4.01. The key technical question is: will the web of the future require functionality that is not present in HTML?

Strange Days

These are strange days for markup geeks. The good news is that we've emerged from the Dark Ages of web development into a Renaissance of sorts. A profusion of browsers have bloomed with reasonably good standards support, and even the current "baseline" browser can handle mid-level CSS layouts and DOM-based JavaScript.

And then there's XHTML. Here we're stuck back in the Dark Ages. There are very few uniquely-XHTML applications available today, aside from a few edge cases like Jacques Distler. And let's face it, most of us aren't physicists. It's a rare website that really needs MathML.

This lack of interesting XHTML applications makes it frustratingly hard to understand what XHTML is all about — what it can do, what it requires of us. In fact, the whole thing is reminiscent of the state of CSS in 1998. You could read the CSS2 spec, but it was hard to imagine something like Fahrner Image Replacement when the tools of the day didn't support basic CSS layouts in the first place. And even after the first tools became available (starting with IE5/Mac), it still took years for the community to come around to the idea that standards matter. Maintaining a 100% valid XHTML website requires a similar philosophical shift — no more cowboy hand-coding without validating every change, no more trusting alien content. Real XHTML is a whole new ballgame.

The lack of XHTML applications has a more insidious effect in that it raises the cost/benefit ratio for converting to XHTML. We can convert, but most of us won't be doing anything with it — the benefit is low. As for the cost, that can be surprisingly high. For example, what's the best way to deal with comments? Even if you manage to programmatically strip out all control characters and unescaped entities, you're still faced with a tough decision:

  • Disable comments entirely?
  • Disallow markup in comments?
  • Allow markup, but force all users to submit valid XHTML comments?

The first two options solve the problem easily, but they restrict your site's functionality. As for the third option, it simultaneously restricts and enhances your site's functionality. Your comment usability suffers, because you can't just dash off a comment and submit it in one easy step. However, your users can now respond in other flavors of XML (if you let them), such as MathML or SVG. If you have a highly specialized audience, this functionality could be critical.

In short, switching your site to XHTML is not a no-brainer, and the trade-offs and decisions only multiply when you get down into the details. Each designer must weigh these costs and benefits individually. There is no "right" answer.

So what about you, Mr. Author-of-this-article? A fair question. For me, the benefits are just too low. I don't have a strong technical reason to switch to XHTML, and so I'm sticking with the technology that meets my needs today: HTML 4.01 Strict. Don't get me wrong: I respect those brave souls out there, the trailblazers. I'm just not one of them. If I wait patiently, the "must-have" XHTML applications will arrive eventually (along with the toolsets to deploy them). This game's only just started.


Permalink › | 37 comments

JAWS Petition: Mea Culpa

September 1

Now the thing about having people you respect tell you when you're wrong is that it really sinks in. The recent JAWS petition has drawn criticism. Many have gone on record denouncing it, some more vociferously than others.

While the points against it have been made, you should not feel bad for signing it. Why? Because your hearts were in the right place, God bless you all.

The resulting dialogue has produced this: Freedom Scientific, the creators of JAWS, offer a completely free demo version of the product that lasts for 40 minutes before requiring a system reboot. Wisdom of the moment says you have no need for a free (or nearly free) developer's edition because this demo version exists. It's good enough for you. (The question is begged: since Freedom Scientific makes zero profit either way, the effective difference is what? One way is 'legitimate', the other is not. Let's not get too tangled up in this, because both arguments have holes.)

We're all on the same side, and quibbles notwithstanding, we want the same thing. So in my journey of discovery over this past week, I continued re-phrasing the same question until I got an answer I was satisfied with. Kynn Bartlett provided me with what I needed to read, and I commend it to you all; give it the attention it deserves.

Points made:

  • JAWS is a specialized application; if you do not commit to a solid week of running it without a monitor to fall back on, you can't expect to use it to test effectively. So even using the demo version isn't going to get you very far.
  • There are other screenreaders on the market; you shouldn't be catering to software, although it can't hurt to later run your work by someone who actually has a screenreader.
  • If you code to WCAG, you are creating accessible sites, software quirks aside.

Read Kynn's tutorial and be aware of the issues. But instead of spending a bit of time fooling around with an application you have little hope of being able to use properly, spend that time familiarizing yourself with WCAG, and brush up with the plethora of information that's available for free already.

update: also consult the JuicyStudio Assistive Device Chart for information on screen reader behaviour.

Permalink › | 10 comments