The new emerging NLE for GNU/Linux

This overarching topic is where the arrangement of our interface components meets considerations of Interaction Design. The interface programming allows us to react to events and trigger behaviour, and it allows us to arrange building blocks within a layout framework. Beyond that, there needs to be some kind of coherency in the way matters are arranged; this is the realm of conventions and guidelines. Yet in any non-trivial UI application there is an intermediate and implicit level of understanding, where things just “happen to happen”, which cannot fully be derived from first principles.

It is fine to have a convention to put the »OK« button on the right. But how are we supposed to trim a clip? With the mouse? the keyboard? a pen? or with a hardware controller we don’t even know yet? We could deal with such questions on a case-by-case basis (as the so-called reasonable people do), or we could aim at an abstract intermediary space, able to assimilate whatever practical situation is yet to come.

the interface has a spatial quality

the elements within a user interface are arranged in a way that parallels our experience of working in real-world space, with the addition of a minor dose of “hyper”, allowing for cross connections and shortcuts beyond spatial logic…

locality of work spaces

but the arrangement of interface interactions is not amorphous; rather, it is segregated into cohesive clusters of closely interrelated actions. We move between these clusters of activity the same way as we move between clearly segregated rooms within a building.

context and focus of activity

most of what we could do in theory is not relevant right now, most of the time. Only when the inner logic of what we are about to do coincides with the things at hand right now do we feel enabled to perform our work.

shift of perspective

and while we work, the focus moves along.
Some things are close at hand; other things are remote and require us to move, re-orient and reshape our perspective, should we choose to turn towards them.

the ability to arrange what is relevant

all day long, we do the same stuff again and again, and this makes us observe and gradually understand matters. As the inner nature of what we’re doing reveals itself, we desire to arrange close at hand what belongs together, and to expunge the superfluous and the distracting.

Foundation Concepts

The primary insight is that we build upon a spatial metaphor — and thus we start out by defining various kinds of locations. We express interactions as happening somewhere…

Work Site

a distinct, coherent place where some ongoing work is done.
The Work Site might move along with the work, but we may also leave it temporarily to visit some other Work Site

the Spot

the Spot is where we currently are, taken both in the sense of a location and of a spotlight. Thus the Spot is potentially located at some Work Site, but can be navigated to another one


Focus

the concrete realisation of the Spot within a given Control System

Control System

a practical technical realisation of a human-computer interface, like keyboard input and navigation, mouse, pen, hardware controller, touch

Focus Goal

an order or instruction to bring something into focus, which also means to move the Spot to the designated location.

UI Frame

the overall interface is arranged into independent top-level segments of equal importance. Practically speaking, we may have multiple top-level windows residing on multiple desktops…


Perspective

a set of concrete configuration parameters defining the contents and arrangement within one UI Frame. The Perspective defines which views are opened and arranged at what position and within which docking panel
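To make the notion of a Perspective as plain configuration data more tangible, here is a minimal Python sketch. The names `ViewPlacement`, `Perspective` and `views_in` are illustrative assumptions, not an actual API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Perspective as plain configuration data.
# ViewPlacement / Perspective / views_in are illustrative names only.

@dataclass
class ViewPlacement:
    view_id: str   # which view to open, e.g. "timeline-1"
    panel: str     # docking panel hosting the view
    position: int  # ordering slot within that panel

@dataclass
class Perspective:
    name: str
    placements: list = field(default_factory=list)

    def views_in(self, panel):
        """List the view IDs arranged within the given docking panel."""
        ordered = sorted(self.placements, key=lambda p: p.position)
        return [p.view_id for p in ordered if p.panel == panel]

# an "edit" Perspective arranging three views into two docking panels
edit = Perspective("edit", [
    ViewPlacement("timeline-1", panel="lower", position=0),
    ViewPlacement("viewer-1",   panel="upper", position=0),
    ViewPlacement("asset-bin",  panel="upper", position=1),
])
```

The point of keeping a Perspective as pure data is that it can be stored, compared and re-applied to a UI Frame independently of any concrete widget toolkit.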

Focus Path

concrete coordinates to reach a specific Work Site.
The Focus Path specifies the UI Frame (top-level window), the Perspective, and then some canonical path to navigate down the component hierarchy in order to reach the anchor point of the new Work Site

the Spot Locator

navigating means moving the Spot Locator, in order to move the Spot from Work Site to Work Site. The Spot Locator is relocated by loading a new Focus Path leading to another Work Site
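The interplay of Focus Path and Spot Locator described above might be sketched as follows; the data layout and method names are assumptions for illustration only.

```python
from dataclasses import dataclass

# Illustrative sketch only: the shape of FocusPath and SpotLocator
# follows the text above, but the concrete layout is an assumption.

@dataclass(frozen=True)
class FocusPath:
    window: str        # UI Frame (top-level window)
    perspective: str
    components: tuple  # canonical path down the component hierarchy

    def __str__(self):
        return f"{self.window}/{self.perspective}/" + ".".join(self.components)

class SpotLocator:
    """moves the Spot from Work Site to Work Site by loading a Focus Path"""

    def __init__(self):
        self.current = None

    def navigate(self, path):
        # a real UI would raise the window, switch the Perspective and
        # walk down the component hierarchy; here we just record the move
        self.current = path
        return path

locator = SpotLocator()
locator.navigate(FocusPath("win-1", "edit", ("timeline", "track-2", "clip-5")))
```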

The concept of a Focus Goal has several ramifications. For one, it implies that there is something akin to the “current Control System” (or the currently active Control Systems), simply because Focus, as the realisation of the abstract notion of the Spot, is always tied to a Control System able to implement it. And when we are able to define generic location coordinates and then “move there” with the help of the Spot Locator, we must conclude that there is a Focus implementation somehow getting shifted towards that location, e.g. the desired entity gaining the keyboard focus. Beyond that, we may conclude that there needs to be some degree of leeway in the way such a Focus Goal can be reached. Since the inner logic of the various Control Systems can differ quite drastically, we are well advised to leave it to the actual Control System how to fulfil the Focus Goal.

To point out an obvious example: it is not a good idea to forcibly move the mouse pointer onto a screen element. Rather, we must use the established mechanisms of switching, scrolling and unfolding to bring the desired target element into the visible area, leaving the last step to the user, who must actively move the mouse onto the target. And we must give good visual cues as to what happened, and what we expect from the user (namely to direct her attention onto the element brought into focus).
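This leeway can be expressed as each Control System implementing the same abstract operation in its own way; a hedged sketch, where all class and method names are hypothetical.

```python
from abc import ABC, abstractmethod

# Hedged sketch of the "leeway" idea: each Control System decides for
# itself how to fulfil a Focus Goal. All names here are hypothetical.

class ControlSystem(ABC):
    @abstractmethod
    def fulfil(self, target):
        """bring `target` into focus in whatever way suits this system"""

class KeyboardControl(ControlSystem):
    def fulfil(self, target):
        # keyboard focus can simply be assigned to the target element
        return f"keyboard focus set on {target}"

class MouseControl(ControlSystem):
    def fulfil(self, target):
        # never warp the pointer: scroll the target into the visible
        # area and highlight it, leaving the last step to the user
        return f"scrolled {target} into view, highlighted for the user"

def issue_focus_goal(active, target):
    """dispatch a Focus Goal to the currently active Control System"""
    return active.fulfil(target)
```

The dispatcher knows only the abstract goal; whether that results in assigning keyboard focus or merely scrolling and highlighting remains a decision of the concrete Control System.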

Building the framework

To create such a system is an ambitious goal, for sure. We cannot reach it in a single step, since it entails the formation of a whole intermediary layer, on top of the usual UI mechanics, yet below the concrete UI interactions. Especially, we need to clarify the meaning of Perspective, and we need to decide on the relation of top-level frame, individual view, layout, focus and current location within the UI. On second thought, building such a system implies that we will have to live with an intermediary state of evolution, where parts of the new framework are already in place, without interfering with the common, conventional usage of the interface as-is.

UI coordinates

Focus navigation especially entails the use of some kind of ubiquitous coordinate system within the user interface. In fact this is more of a topological navigation, since these coordinates describe the decisions and forks taken when navigating down the Focus Path.

  • [optional] top-level Window (UI frame)

  • [optional] Perspective

  • Panel

  • View-ID

  • [optional] Group or Tab

  • local path: component.component.component
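The coordinate scheme listed above could be serialised into a simple textual path; the slash-separated syntax below is invented for illustration, and the optional Group/Tab segment is omitted for brevity.

```python
# Sketch of the topological UI coordinates listed above. The
# slash-separated textual syntax is invented for illustration; the
# optional Group/Tab segment is omitted for brevity.

def parse_ui_coord(spec):
    """split "window/perspective/panel/viewID/comp.comp.comp" into its
    parts; the leading window and perspective segments are optional
    and may be left empty"""
    window, perspective, panel, view, local = spec.split("/", 4)
    return {
        "window": window or None,            # [optional] UI frame
        "perspective": perspective or None,  # [optional]
        "panel": panel,
        "view": view,
        "path": local.split("."),            # component.component.component
    }

coord = parse_ui_coord("win-1/edit/lower/timeline-1/track-2.clip-5")
```

Leaving the window and Perspective segments empty yields a coordinate relative to the current UI Frame, which matches their marking as optional in the list above.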