Clearly, Google wants its Explorers to start exploring as fast as possible, because after ordering Glass on Friday afternoon, I’ve come back Monday evening to find it waiting for me.
I haven’t yet had a chance to play with it, though the purpose of this particular post is not to discuss my first impressions or initial interactions with the device. Actually, I want to put those off for now, and discuss Google’s design philosophy as it relates to Glass, and how that could translate (or not) into today’s oil & gas fields. I want to do this now, before I handle the device, and then see how these thoughts evolve over time as I interact with and hopefully write some code for it.
Over the weekend, I downloaded the Glass Development Kit (GDK) as an add-on to the Android tools I already have, and took a look at some included example source code, to get a feel for how to program on Glass.
Unfortunately, there really isn’t much out there (as I insinuated in my last post on this particular topic), but I did find this really interesting video, showcasing a presentation made by a Google employee at last year’s South By Southwest conference. To the extent that Google wants this to catch on, it makes sense that the company would want to influence the “best practices” of software design as soon as possible. I’m really glad I found this video, since it’s an extremely helpful guide to WHAT to code, which in Glass’s case is even more important than HOW to code. In fact, it was this video which led me to pose the question I bring up in this post’s title.
As Google sees it, there are four “pillars” in the Glass Design Philosophy (GDP). If there’s one overarching message I took away from that video on using Glass, it’s that this technology should be “there when you want it, and out of the way when you don’t”, and each of these four pillars flows from that idea.
Adapt your applications to Glass
Glass use hinges on what Google calls “timeline cards”. You can think of these as digital index cards sent to the user’s Glass eyepiece, and putting information on a timeline card should follow the same adage you followed when making flashcards to study for exams: “less is more”.
The speaker gives a great example: while you wouldn’t want to read full newspaper articles on Glass, you COULD receive a timeline card with the day’s headlines. I submit that you could then speak the headlines you’re interested in, and Glass could perhaps communicate with your tablet or smartphone to bring up the full piece.
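I haven’t written a line of Glass code yet, but from what I’ve read, a server-side timeline card (inserted via Google’s Mirror API) is just a small JSON payload. Here’s a minimal sketch of what that headlines card might look like; the helper function and the sample headlines are my own invention, though the `text` and `menuItems` fields do follow the Mirror API’s timeline-item format:

```python
import json

def headlines_card(headlines, limit=3):
    """Hypothetical helper: build a Mirror-API-style timeline item that
    carries only the day's top headlines -- "less is more" on a
    card-sized display. READ_ALOUD lets the wearer have the card spoken."""
    text = "\n".join(headlines[:limit])
    return {
        "text": text,
        "menuItems": [{"action": "READ_ALOUD"}],
    }

# Note that anything past the limit simply never reaches the eyepiece.
card = headlines_card([
    "Rig count rises in the Permian",
    "New offshore lease sale announced",
    "Crude inventories fall again",
    "A fourth headline that didn't make the cut",
])
print(json.dumps(card, indent=2))
```

The distillation happens before the card is built: the device only ever sees the three lines it has room for.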
The biggest challenge I see as I embark on this personal project to explore how Glass (and wearable technology generally) can be adapted to oil & gas applications is that the oil & gas industry’s data needs are growing larger and larger (the result of more challenging wells and reservoirs) while displays continue to shrink.
The key to adoption of wearable technology in the oil & gas industry lies in how those two diverging trends are reconciled.
It is really, really crucial that the oil & gas industry not see Glass and its future iterations as a way to PUSH more and more data onto its field specialists and engineers, but rather as a means by which these professionals can more easily PULL that data, in a way that frees them up to work more safely and efficiently.
This is critical: I fear that if this crucial first pillar of Google’s GDP isn’t adapted to wearable technology’s first forays into the oilfield, the technology could be rejected by an overwhelmed workforce that would rather just stick with the ways they’re used to.
Google Glass should be unobtrusive
This translates to the user having maximum control over settings, and a minimum number of automated push notifications.
Content needs to be kept timely
The speaker in the video implies that different mobile devices are suited to different time frames: while mobile phones and tablets are great for bringing up what happened today, or even last week, Glass is for “right here, right now”.
In the field, if anything is pushed to an engineer or field specialist, it should concern only the matter they are dealing with at that very moment.
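That rule is simple enough to write down. Here’s a toy sketch of the gate I have in mind; the function, job identifiers, and ten-minute freshness window are all hypothetical choices of mine, not anything from Google’s documentation:

```python
from datetime import datetime, timedelta, timezone

def should_push(card_time, card_job, current_job,
                max_age=timedelta(minutes=10)):
    """Hypothetical gate: push a card only if it concerns the job the
    specialist is on RIGHT NOW, and is still fresh enough to matter."""
    age = datetime.now(timezone.utc) - card_time
    return card_job == current_job and age <= max_age

now = datetime.now(timezone.utc)
print(should_push(now, "frac-stage-4", "frac-stage-4"))   # fresh, same job
print(should_push(now - timedelta(hours=2),
                  "frac-stage-4", "frac-stage-4"))        # same job, but stale
```

Everything that fails the gate still lands somewhere the engineer can PULL it from later; it just doesn’t interrupt them in the moment.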
Avoid the unexpected
A great point made in the video is that any bad user interface or experience is actually magnified on Glass because it is permanently ON you. I’m getting an eye-twitch/migraine just thinking about what constant pop-ups just above my field of vision would do while I was trying to put a tool together or walk the rig crew through a setting procedure!
Summary thoughts for now
Traditionally, the oil & gas industry has focused much more on function than form, as it should: all that matters is that the job gets done safely. In the particular case of wearable technology, though, I do think more attention is going to have to be paid to the user interface. It needs to be pared down, it needs to be slick, and it needs to be inviting, in addition to delivering to personnel the specialized content they need to do their jobs.
While it’s always easy to work without constraints, the very real physical constraints posed by Google Glass present a challenge to software designers who ultimately will not just have to code for this device and others like it, but understand the problem they are tackling so well that they can distill it down to a tiny digital index card.
Next time: some hands-on impressions!