Further investigation of Drupal 7

My objective in trying out Drupal 7 is straightforward: I would like a platform for creating web sites at the absolute minimum of cost, and within those sites to give clients the greatest opportunity to create and curate their own content.

My test case is to build the most lightweight system imaginable for promoting events. It doesn’t bother me too much if some fields seem rather specialised, because an event is a specialised kind of content, but the intention is to accommodate a wide range of events: one-offs such as a music recital or a two-day conference, and repeating items such as a theatre production. All the events need to be presented in the same way; by that I mean it should be possible to fill out the required information in the same way whatever the event. We know that some events will have a speaker or a producer and others might have a musical ensemble, but separate boxes for every such role (composer, speaker, director…) are not a great idea: it becomes very tedious for the person adding the information to find the right box and to be confident it is the correct one.

So far

I haven’t yet found outcroppings of the stick-little-bits-of-php-in-a-database approach I remember from Drupal 4 (or was it 5?) back in 2006. I am not sad.

It is possible to use Drupal’s GUI to set up customisations of data and structure, but modifying the display of that information seems to require work on templates for the overall page and/or the subcomponents within pages. Stylesheets can probably make quite considerable changes to the appearance of every page on a site, but finer-grained control, or showing and hiding information within only certain areas, will require PHP template files (in Drupal 7, for instance, a node--event.tpl.php file in a theme would override the display of an ‘event’ content type).

User-generated content lives in Nodes or Comments; clearly Comments are for comments, so realistically a Node is the atomic unit for everything else a user creates.

You can have as many specialised Node types as you wish, and specify subsidiary content fields that are unique to one Node type or shared between types across an installation. Whether Sites within a single installation would share those specialised Node types I don’t know; I have no insight into the workings of multiple Sites within one installation.

You can’t put a Node inside a Node, so you can’t have subsidiary content (for example, you cannot make a person Node and attach it as a child or property of an event Node).

You can use taxonomy to take often-repeated information (say, the room names in a building) and make it available as easy-to-insert fields attached to Nodes. Taxonomy is implemented as any number of Vocabularies containing Terms, and the Terms can have any number of fields attached to them. The obvious field is the set of words that make up the term, but there is also a description field by default, and you can add more. It doesn’t feel as though Terms are a reasonable place to put things that will differ every time, like people’s names; Terms feel more like proper nouns for concepts. And a Term has to be created in its Vocabulary before any Node that needs to reference it.

Next steps

There are probably hacks around these restrictions, but they would negate the benefit of using a regularised, straitjacketed system such as Drupal. Despite that, it certainly feels as if a little more prodding is justified, because I can see that by accepting a compromise or two in the absolute beauty of the data structures I might have a reliable system in a matter of hours rather than days or weeks.

And did I mention that you can write tests for your own Drupal module code? That surely merits a look sooner rather than later.

Published on 19/03/2011 at 22:50 by Technophile

Can't log in to Ubuntu 10.04 on a 2006 Shuttle XPC?

This is just on the off-chance that somebody does what I’ve just done: upgrade a desktop Ubuntu 9.04 installation to 10.04 on a Shuttle XPC (this one is a 2006 AMD64 model with motherboard graphics only) that has a 1600 x 1200 monitor attached.

After the upgrade you may find you cannot log in using the standard GNOME graphical interface: the monitor blanks for a few seconds and then you are returned to the login screen. At the bottom of that screen, though, is a small menu allowing you to choose whether to log in with regular GNOME, an xterm, or failsafe GNOME. Try failsafe GNOME; if you can log in successfully, use System > Preferences > Monitors to reduce the screen resolution, then try the regular GNOME login again.

Update: I think I have an i386 install of Ubuntu (I’m not entirely sure why). That might explain why the video driver, which I believe is OpenChrome, won’t load. The integrated chipset in the machine (a Shuttle SK21G) is entirely geared to the AMD Sempron, a 64-bit chip. Falling back to the failsafe video driver gets around the problem, but this is obviously far from a good solution :-(

Published on 16/02/2011 at 14:37 by Technophile

A first impression of Drupal 7

I’ve been hearing from coworkers that Drupal 7 is a leap ahead in several respects, one being its user interface. I’ve used Drupal only once before, back at version 5, and found that the combination of a complex UI and entirely unfamiliar terminology was quite daunting. Although I also learned that it had a very friendly and active community, that wasn’t enough to make me stick around.

I’ve just stuck a Drupal 7 install onto my machine and I am truly struck by the clarity that better typography and a less cluttered interface design can bring. It does, however, make the very idiosyncratic terminology and approach stand out in sharp relief. I hope the other changes have made Drupal easier to pick up than it was a few years back; it is very brave to offer so much flexibility through an in-browser interface aimed at non-specialists.

Edit: added link (duh). Drupal is somewhere between a website-in-a-box and a full coding framework, by the way.

Published on 11/02/2011 at 22:07 by Technophile

Thoughts about The Design of Understanding conference

It was a real privilege to have Max Gadney curate the Design of Understanding conference, which took place yesterday at St Bride Library. I think we were very lucky to catch him at the right moment: well-informed, well-connected and a clear thinker, he happened to be mid-leap from the BBC to private enterprise at the point we invited him.

At that point he also had some recent design-teaching experience fresh in his mind. As he explained at the time, that experience made him feel that design students aren’t getting exposure to the kinds of design work that are perhaps the most significant at the moment, and that they don’t get a chance to work with the data that characteristically lies behind such work. They don’t have a chance to gain an insight into the business processes or bureaucracy from which the data comes, or to learn the importance of understanding those things to some degree before attempting to pass the information on to others in a polished visual form.

This is also my reading of the short piece he wrote for Eye 78 (on page 100), which was published serendipitously at the start of the week. Eye 78 is a treasure: it focuses on information design and offers a pleasing set of contributions which I think student readers will find helpful in understanding some of the ingredients. To have it appear at the start of the week brought an extra dash of anticipation to the conference itself.

I was very pleased with the outcome of the day. Treating the event very much as a test of his new design curriculum, Max chose speakers who work to a greater or lesser extent outside the realm of visual design. It was good to have people who could talk intelligently about the visualisation of data without the superfluous embellishments of design orthodoxy. Whatever the qualities of the finished work, they were interested in its success as a whole rather than as an aesthetic expression alone. It was also good to hear authoritative statements about how contemporary exhibition designers are concerned with the communicative capacity of what they create rather than with the media in which they communicate.

The talks showed that our speakers believe the purpose of the design work is not to parcel up information from other people and present it in a novel way, but to increase the chances of engagement and understanding for the users of the end product. Designers working in this way can expect to have an influence on their collaborators, because they show that they understand the nature and purpose of the information with which they work.

Postscript

This conference’s tenor was set by dissatisfaction with the way information design is approached, and by a belief that this can be fixed by changing the kind of work that students of design carry out. There were 29 concessionary tickets (for students or those over 60) in our audience of around 150. As an event organiser for the Friends of St Bride Library I have argued, with little opposition, for concessionary tickets to remain generously discounted. We believe the standard of our events is high, and that the opportunities they offer students to enhance their understanding of the field, their knowledge of the wider design community and their contacts with practitioners are considerable.

With last year’s events almost all sellouts, and with its overall budget severely constrained, St Bride Foundation will have to think carefully about future event pricing. I would urge students and other concessionary ticket buyers to share their experiences of, and opinions about, our conferences so that we can make a well-informed decision. Email: events@stbride.org.

Published on 29/01/2011 at 16:47 by Technophile

Reverting to Type in Hoxton

A crowd of top-notch letterpress practitioners, designers and students will be showing their work at an exhibition in Hoxton that opens in early December. I recommend a visit: with work from twenty or so very disparate groups and individuals, there should be something for everyone with an inclination towards type.

At Standpoint Gallery, 45 Coronet St, Hoxton, London N1 6HD.

Open daily, 10am–6pm; 10–24 December 2010 and 4–22 January 2011.

Published on 10/11/2010 at 09:11 by Technophile

Excel favours its own comma-separated value files

I found just now that Excel 2008 for Mac treats CSV files it thinks it created in a different, better way. Specifically, it handles double-quoted strings (containing commas) properly, rather than ignoring the quotes and adding a field separator where you don’t want one. So if your scripting task includes creating a CSV file, it might be worth keeping an empty Excel-generated CSV file within your codebase, using the OS to copy that file to a new target, and then writing your data into that. I have not looked to see whether this behaviour is maintained if you work with the file on a different operating system; I suspect it wouldn’t hold up.
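
Something like this Ruby sketch is what I have in mind (the template path and method name are hypothetical, and on Ruby 1.8 you would need the FasterCSV gem in place of the 1.9 standard library’s CSV):

    require 'fileutils'
    require 'csv'    # Ruby 1.9 standard library; use the FasterCSV gem on 1.8

    # excel_seed.csv is an empty file that was saved from Excel itself.
    TEMPLATE = File.expand_path('../templates/excel_seed.csv', __FILE__)

    def write_excel_friendly_csv(target, rows)
      FileUtils.cp(TEMPLATE, target)     # start from the Excel-created file
      CSV.open(target, 'ab') do |csv|    # append our rows to the copy
        rows.each { |row| csv << row }
      end
    end

    write_excel_friendly_csv('events.csv',
      [['Title', 'Date'], ['Recital, with strings', '2011-03-19']])

The copy preserves whatever it is that makes Excel recognise the file as its own, and the CSV library takes care of quoting any fields that contain commas.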

Published on 13/09/2010 at 11:45 by Technophile

Web fonts as commodity

TypeCon 2010 is providing an opportunity for the webfont enthusiasts in the font software industry to make some noise. In his keynote (liveblogged by Dave Crossland), Roger Black raised the possibility – the challenge – of the ‘99¢ webfont’. This, he said, means individuals will soon all have their signature font, enabled through the online communications and document-sharing systems they use, and it won’t be Comic Sans but a web font. At the other end of the spectrum, Font Bureau, in a consortium with the selfsame Roger Black and others, has started webtype.com (note: ‘web type’, not ‘web font’ – that’s significant). This is a subscription service providing font families that have been optimised for screen display: not fonts designed for the screen, but re-hinted, re-shaped versions of what might be termed ‘regular’ typefaces. They really do embody expertise in the optimisation of glyph outlines for on-screen reading – a contentious and misunderstood problem, as Black acknowledged – and as far as visual designers are concerned this puts them ahead of the offering of the more established land-grabber TypeKit.

The strategy seems very sensible, and Roger Black’s allusion to the total commoditisation of fonts online points to the scenario that has been latent ever since the first designer cried because they couldn’t use the typeface they wanted in a site design. It’s taken a long time for the font software industry to drag itself here, but the day has now arrived when people who publish fonts will differentiate themselves on the rendering merits of web fonts, and generate regular revenue from them. There is a lot of money in those hills.

Published on 20/08/2010 at 09:56 by Technophile

Untangling RVM and Bundler, for the novice

I was introduced to the combination of RVM and Bundler while working with my friend James on a project earlier in the year. It’s a nice pairing of Ruby tools that offers developers a way to keep tabs on the gems they are using in a project and to avoid conflicts, or the accidental use of other versions of those gems, while actively developing or deploying their work. Typically these projects might be Ruby on Rails web applications, but although ‘Rails’ is often accidentally substituted for ‘Ruby’ there is no reason why Bundler + RVM should be any less useful outside Rails.

I found the post he pointed out, by Mikel Lindsaar, very helpful in keeping me enthused and getting me to try things out, but the concepts were still hard to grasp. Although both Bundler and RVM are adequately documented, somehow their example-driven documentation doesn’t quite make things clear at first or second reading. So here’s a quick synopsis, written to help my own understanding.

RVM

RVM (Ruby Version Manager) is a tool that creates named Ruby builds and named sets of gems. It lets you swap between these builds and sets at will.

Use RVM to build a named Ruby, then use RVM commands to modify the environment in your current shell so that it uses that build instead of the system default. Likewise, once you have created a named gemset with RVM, you can use RVM commands to bring the gems in that gemset into use in your shell; they will be made available to whichever Ruby your environment currently points at.
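
For example (a sketch only; the version number and gemset name are arbitrary):

    rvm install 1.9.2             # build a named Ruby
    rvm use 1.9.2                 # point the current shell at that build
    rvm gemset create eventsite   # create a named gemset
    rvm use 1.9.2@eventsite       # switch to that Ruby with that gemset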

A .rvmrc file lets RVM switch the Ruby build and gemset automatically, without human intervention.
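
The file can be a single line at the project root (the names are again arbitrary), and RVM acts on it whenever you cd into the directory:

    rvm use 1.9.2@eventsite --create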

Bundler

Bundler is a gem that installs and uninstalls other gems following a list you maintain.

The list centralises the information about which gems are needed to allow a given project’s Ruby code to run. It tidies up some loose ends and hard-to-manage conflicts by letting you specify exactly which gem version is needed to run the code, so that if a feature is missing from earlier gem versions, or removed from later ones, you can keep using the version you need even when it is obsolete.

The Gemfile, the list that Bundler works with, is plaintext and therefore version-controllable. It sits naturally at the root of the project.
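
A minimal Gemfile might look something like this (the gems and version constraints here are purely illustrative):

    source 'http://rubygems.org'

    gem 'rails', '3.0.0'          # exactly this version
    gem 'nokogiri', '~> 1.4.4'    # any 1.4.x release from 1.4.4 upwards

    group :test do
      gem 'rspec', '>= 2.0.0'     # only needed when running the tests
    end

Running bundle install then resolves the list and installs whatever is missing into the currently active gem environment.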

Putting them together

RVM is great on its own, because as you move between projects in daily work on your own machine you can swap between different Ruby versions and different sets of gems instantly. You are protected from an update to a gem on the system breaking things in your projects, and you can retrospectively introduce an RVM gemset to sort things out if that does happen. That should make it really useful for anyone developing gems or dealing with legacy code issues.

The Bundler gem is required by Rails 3 itself (‘baked in’, as I am sure its developers would say, with tongues in their cheeks), and I guess that’s because it was the Rails use case that caused Bundler to be written. So either tool is useful alone. The point of Mikel’s post, though, is that by coupling the two together you get a really nice combination. Here’s what you can do:

  1. RVM to create a named Ruby build, if desired

  2. RVM to create a named gemset (probably named for the project)

  3. .rvmrc file in project root to ensure that when code is executed in the project’s directory tree it’s run with the right Ruby and gemset

  4. Gemfile in project root to specify gems required for the codebase (remember this is version-controllable)

  5. Bundler to read the list of gems and install them; the gems are installed within the environment that’s active, so they’ll be installed within the RVM gemset for the project and not elsewhere (see the sketch after this list)
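
In practice, once those pieces are in place, setting up a fresh checkout of the project comes down to two commands (assuming the one-line .rvmrc above):

    cd myproject      # the .rvmrc switches Ruby and gemset automatically
    bundle install    # Bundler installs the Gemfile's gems into the gemset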

Of course, benefiting from all this on your own machine with multiple projects is one thing; RVM will not seem nearly as necessary if the project code is deployed to a dedicated server with no other web applications running on it. But not all deployments look like that.

I think I have this all right; comments welcome of course.

Published on 11/07/2010 at 13:28 by Technophile

The Design of Understanding

I’m getting excited about a conference in planning for January 2011 at St Bride Library in London.

It’s part of St Bride Library’s series of one-day events, intended to be both approachable and affordable. This particular conference is The Design of Understanding, and it will be curated by Max Gadney.

A glance at the speakers lined up so far gives a good idea of the quality we can expect from the day’s talks. If you’re into user experience, graphics, information design and data, and can make it to London for 28 January, be sure to be there.

Published on 06/07/2010 at 21:35 by Technophile
