Object-Oriented UX, Part 1

Think Company
11 min read · Feb 22, 2019

By Keren (Veisblatt) Toledano | Originally posted on the Think Blog.

Read part 2 of this series here.

In the physical world, like in a drugstore or a supermarket, I have been trained to expect certain wayfinding points.

Look up for aisle signs — they serve as broad menu items, mimicking the universal navigation of an interface. There are aisle numbers with broad categorizations. But I am also trained to find objects by looking for other, related objects. The vitamins are near the pharmacy because both are loosely “health related.” The diapers, family planning merchandise, and feminine hygiene products are proximal. The milk is near the cheese, almost always in the back of the store. The fresh produce is on display near the entrance. When I find pears, I know apples are close by.

This is content discovery through the content itself. Sure, there are small discrepancies between categorization and location from store to store, but overall, we’ve miraculously agreed to classify and group objects used in our shared, retail world. That’s because locally we agree on how to group certain shared traits — e.g., the need for refrigeration, snack foods, and fresh wheat products. I have been in unfamiliar markets, and shopping can be less enjoyable for me in those places — more disorienting and time-consuming. The source of confusion is probably as much cultural as it is contextual: “Well of course the Q-tips are in the makeup aisle instead of by the first aid chemicals.”

Those aisle markers depend on shared understandings and context. In a supermarket, we use them to seek out broad categories, then specific and niche items. Or, when it’s our first time visiting a supermarket, we use these aisle markers as initial orientation. Upon return, we likely won’t use them again. But they also tend to be employed when we’re lost. We use aisle markers when the associative relationships between foods no longer make natural sense to us in the physical world. When we can’t find something.

In the physical world, we expect systems. Systems that create standards which help ameliorate confusion and lay out simple processes for people to follow. Systems that breed familiarity. Systems that establish relationships between things. Not just hierarchical, linear, parent-child relationships, but heterarchical, associative ones. Systems that don’t need a specific order to be efficient. To be successful in the physical world, systems rely on the categories that address the similarities, and quantifiable differences, between objects — shape, size, color, name, etc. — and then, based on context, give priority to those things. Only then can we begin to associate.

VIRTUAL VS. REALITY

In the physical world, it is much harder to hide poor inventory management and messiness. Even subconsciously, when I walk into a clothing store, if the socks aren’t near the stockings, I get miffed. Poor organization! When I see unfolded fabric strewn about, price tag mismatches, dirt on the mannequins, and a cash register that reads, “Out of Service,” I begin to wonder about the quality of the company’s products. If a sign for the dressing room leads me in the wrong direction, I get downright bothered. Mislabeled routes! Within the physical world, objects are visibly stacked and present — non-virtual — so shoddy organization is more apparent and always ready for immediate inspection.

In the digital world, we can conceal a multitude of sins. Bad data. Partially loaded components. Disruptive interstitials. Deprecated API calls. Dead-end links. Competing local, universal, or global searches. Outdated visuals and mismatched typography. Duplicate pages with redundant information. Broken images. Page disharmony. Customers wondering, “Am I in the right place?” Content not appearing “where it’s supposed to be.”

Our tolerance can be high for substandard digital experiences — the virtual equivalent of piles of unsorted button-down shirts on the floor of a boutique. We have allowed it, in part, because it’s not as distracting. Not as “real” or solid. It’s easy to get away with these transgressions because they’re not all experienced by the same customer at the same time, nor are they viewable in an immediate, analog way. Not in the same way a spill on aisle seven would be. People don’t consider themselves experts of interfaces in quite the same way a person knows when something is dirty, disorganized, or mislabeled at a store. Users give us the benefit of the doubt. They often assume that they’re wrong. That they have failed or somehow clicked “the wrong thing.”

We get away with a lot of this poor organization because we treat navigation as an escape hatch. A “get-out-of-jail-free” card that can reset the experience, and bring the user back to the baseline. And yet, more and more, we deprioritize this navigation by hiding the menu within a hamburger or drawer. In fact, many apps get away with no top-level navigation whatsoever. So, why does a tree-like, linear navigation remain the standard within web interfaces?

ENTER OBJECT-ORIENTED USER EXPERIENCE

OOUX is the process of planning a system of interacting objects. It helps us organize a navigation that is circular and contextual by defining associations between things. Spiderwebs over trees. Heterarchies instead of hierarchies. A heterarchy is a system of organization whose elements are unranked, and which can therefore be ranked in an infinite number of different ways, based on context. It’s an inherently flexible structure made of smaller, independent units.
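The distinction between the two structures can be sketched in a few lines of Python. The grocery categories and associations below are hypothetical illustrations, not part of any real system: a hierarchy gives each item exactly one parent and one path, while a heterarchy is a web of unranked associations with many paths to the same object.

```python
# A hierarchy (tree): each item has exactly one parent, so there is
# exactly one path down to it.
hierarchy = {
    "Groceries": ["Dairy", "Produce"],
    "Dairy": ["Milk", "Cheese"],
    "Produce": ["Apples", "Pears"],
}

# A heterarchy (web): unranked associations. Any object can link to any
# other, so "Milk" is reachable from "Cheese", "Breakfast", and so on.
associations = {
    "Milk": {"Cheese", "Breakfast", "Refrigerated"},
    "Cheese": {"Milk", "Refrigerated"},
    "Apples": {"Pears", "Fresh"},
    "Pears": {"Apples", "Fresh"},
}

def related(item):
    """Contextual navigation: surface whatever is associated, no fixed rank."""
    return sorted(associations.get(item, set()))

print(related("Milk"))  # ['Breakfast', 'Cheese', 'Refrigerated']
```

Because nothing in the web is ranked, any context (a search, a related-items module, a recent-activity list) is free to impose its own ordering on the same objects.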

The framework is a form of information organization that asks us to define the “things” (often nouns) that make up our system, prior to representing them online. Prior to even wireframing. It compels us to think about the relationships between objects. It requires us to uncover our customers’ corporeal world and to expose their mental model. The “mental model” reveals how information is mapped in users’ minds and how something works in their real world. For example, do our users believe that bananas are more similar to pineapples than to plantains? Why is that? OOUX asks us to question the uniqueness of a physical object, and the repeatability of a digital thing. It places a premium on “things” before task flows. We think about all aspects of a “thing” before we can assign a digital-world action to it (edit, manage, pay, compare, search, etc.).

Of course, developers have been working this way for years. They define a system as a collection of related object types (“classes” of data outlining fields, relationships, and hierarchy), then attach possible actions (methods) to said objects. Finally, they weave relationships between objects (message passing, events, and event handlers). All of these high-level rules control behaviors, which are then abstracted into our designs. The nomenclature is different, as are the methods. But the ethos of the two frameworks, object-oriented programming and object-oriented user experience, is similar. Both tout defining and understanding resources before interface representation. Both approaches place a premium on modularity and reusability. Through OOUX, a designer or researcher can better communicate with a developer. That’s because the artifacts created during the process are documentation of relationship maps, names of modules, and definitions of each object. This allows the developer to better see our intent, and to better understand the product we’re building. They won’t have to abstract and work backwards from flat visuals.
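To make the parallel concrete, here is a minimal object-oriented programming sketch of the pattern described above. The `Bill` and `Account` classes and their fields are hypothetical examples, not a real billing API: fields hold the data, methods attach the actions, and one object holds references to others to express relationships.

```python
class Account:
    """An object type: its fields include a relationship to other objects."""
    def __init__(self, owner):
        self.owner = owner
        self.bills = []               # one-to-many relationship: Account -> Bills

    def add_bill(self, bill):
        self.bills.append(bill)


class Bill:
    """Fields describe the object; methods are the actions attached to it."""
    def __init__(self, amount, account):
        self.amount = amount          # field (data)
        self.account = account        # relationship back to an Account
        self.paid = False

    # Actions belong to the object, rather than the object being listed
    # under every action that might apply to it.
    def pay(self):
        self.paid = True

    def dispute(self, reason):
        return f"Dispute opened for ${self.amount}: {reason}"


account = Account("Keren")
bill = Bill(42.50, account)
account.add_bill(bill)
bill.pay()
print(bill.paid)  # True
```

An OOUX object map plays the same role as these class definitions: it tells the developer, up front, what each object contains, what can be done to it, and what it connects to.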

Let’s break that down into more accessible examples. We often design “verb” first without realizing it. This is akin to designing a task flow or a pure user story. We posit, “a user needs to pay a bill” or “a user needs to edit their contact information.” “Pay” and “edit” would be the verbs. And they are the focus of these tasks. This often means designing a clear, prioritized path. The “blue sky” route. The “happy path.” The problems with this way of thinking are myriad. We deprioritize the experience when a user “goes off path.” We assume that tasks are linear, when in fact the world is associative. We forget about the bill itself. Many things can be done to a bill. We can “pay it,” “dispute it,” “compare it,” “download it,” and so forth. We miss opportunities for natural actions.

By designing one flow at a time without having mapped out the entire product offering at the beginning of the project, we can easily overlook certain opportunities or places to seed content. Then, as new features get introduced to the product, the designers need to shoehorn things into existing interactions and screens. By designing around “objects” rather than pages, we can break down barriers that we needlessly put in front of our users in their search for information. What must “contact information” contain? What must a “bill” contain? Are all elements within those things unique, or are they needed in many parts of the experience?

PEOPLE DON’T THINK IN ACTIONS

In OOUX, we are more interested in the objects in the system. What are all of their properties? Do all objects share some of the same properties? Do some objects have unique content, distinct from others? How many objects should the system have? Is the object part of something larger — like how a sock is both a single item, and a part of an outfit? A sock is also made of fabric, which is an object. Is the yarn the smallest object we want to tackle? It is up to us to define the demarcations. In what ways can we connect one object to another? How is a sock connected to another sock? How is it connected to shoes?
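The sock example above can be sketched as nested objects. The class names and fields here are hypothetical, chosen only to illustrate how one object can be a part of a larger object, be made of a smaller one, and connect to a peer:

```python
class Fabric:
    """Is the fabric the smallest object we want to tackle? We decide."""
    def __init__(self, material):
        self.material = material


class Sock:
    """A single item that nests a smaller object and links to a peer."""
    def __init__(self, fabric, pair_id):
        self.fabric = fabric      # nested object: a sock is made of fabric
        self.pair_id = pair_id    # association: connects one sock to its mate


class Outfit:
    """A larger object of which a sock is just one part."""
    def __init__(self, items):
        self.items = items


wool = Fabric("wool")
left, right = Sock(wool, pair_id=7), Sock(wool, pair_id=7)
outfit = Outfit([left, right])

print(left.pair_id == right.pair_id)  # True: the shared id connects the pair
```

Where the demarcations fall (is yarn its own object, or just a property of fabric?) is a design decision the team makes explicitly, rather than one that emerges by accident, screen by screen.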

We cannot help people navigate to something unless we fully understand the elements of the thing itself. This framework meets people where they are. People buy physical products for their physical offices. People buy physical clothing for their physical bodies. People buy physical food for their physical stomachs. They can receive physical bills. They can pay with physical money. The physical world is associative. Humans have evolved to interact with the physical world. Our understanding of screen-based environments is nascent. Humans think about the world, and their needs, in terms of real-world objects. Data, and relationships between objects, are abstractions. They are important, but they are not as concrete and easily understandable.

Imagine if you owned a car, a motorcycle, a dog, and a home. Add to that scenario the fact that you must provide for yourself and a small infant. These are the objects in your life. If people were to think action first, they would have to say something like, “Today I want to refuel.” And then, based on that action, they would be presented with a menu of objects which can be refueled: the car, the motorcycle. “Today I want to brush.” Menu of objects: dog, baby’s hair, my hair. “Today I want to wash.” Menu of objects: motorcycle, home’s windows or siding, car, my hair, my baby’s hair, my dog.

This method of organization creates redundancies in your objects. It’s a slow system that requires you to recall any object you may own which may fit the need for an action. The dog can be washed, brushed, petted, and fed. The motorcycle can be washed, refueled, ridden, and sold. Within the navigation, the dog has to appear in several places, hidden behind each action.

But, if I were to shift my frame of thinking to move from actions to objects, the organization and connective fascia between the objects in my life would be more easily understood. I have a baby. My baby has teeth. My baby has hair. My baby has skin. My baby has hunger. Because my baby has teeth, hair, and skin, those elements need to be washed. Because my child has hunger, it needs to be fed. I only need to reference the child once to view all of its potential actions. To see all of its content.

I have a motorcycle. It has a gas tank. The tank needs to be refueled. The motorcycle has tires, hubcaps, and a chassis. Those objects need washing. I can clean those objects individually or as a group. In this framework, I first select the object I desire to affect, then I select the action I hope will affect it.
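The two framings above can be compared directly in data. This is an illustrative sketch using the article’s own examples: in an action-first index, every object must be duplicated under every verb that might apply to it; in an object-first index, each object appears once and carries its own actions.

```python
# Action-first: each verb lists every object it might apply to,
# so the same object is repeated under many actions.
action_first = {
    "wash":   ["dog", "motorcycle", "car", "baby's hair", "my hair"],
    "brush":  ["dog", "baby's hair", "my hair"],
    "refuel": ["car", "motorcycle"],
}

# Object-first: reference each object once; its actions hang off of it.
object_first = {
    "dog":        {"wash", "brush", "pet", "feed"},
    "motorcycle": {"wash", "refuel", "ride", "sell"},
    "baby":       {"wash", "brush", "feed"},
}

def appearances(obj, action_index):
    """How many places must an object appear in an action-first navigation?"""
    return sum(obj in objs for objs in action_index.values())

print(appearances("dog", action_first))  # 2: once per action that supports it
```

Every new action added to the system multiplies the duplication in the action-first index, while the object-first index grows by a single entry per object.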

Because there are so many objects in my life, and all of them are important to me, it’s hard to force a hierarchy. My world is constantly shifting contexts. It would be hard to decide which of these objects in my life has highest priority. When my child is hungry, that takes first priority. When my dog needs to go outside, that is also now first priority. If my car’s windshield is cracked, fixing it is now my first priority. Each of those would need to be in “pole position” in my navigation at some point. But that’s not possible in a fixed, linear hierarchy. Object-oriented UX allows for this kind of situational prioritization. It offers flexibility in organization. It assumes that our relationships with objects are not static, and helps us to better suggest next steps to our users.

THE PLUG FOR PART TWO

Of course digital systems have many more objects, which are often more complicated than the aforementioned examples. Hundreds of thousands of bits of data that need to be interconnected. But that’s where the value of this method shines, and where the OOUX magic begins! OOUX is based on reusable modules that are ripe for iteration. A single change in one part of the system automatically ripples through the rest. A non-hierarchical navigation helps our users get to their content through other bits of familiar content. There are no “wrong paths.” It’s also easier for us to add features, as these would only be additional objects and not frankensteined task flows, amended sitemaps, or edited single pages.

Now that you’ve (hopefully) been convinced (or at least intrigued) by the possibilities of an object-oriented user experience, we’ll discuss how to build this system of design in part 2 of this series. We’ll also outline how this framework increases design value, business value, and user value.

THIS WILL INCLUDE HOW TO:

  • Extract objects: What’s an object anyway? How do we know? How do we find and define them? Where does the user come into play? What activities can we perform that allow customers to tell us their preferred objects?
  • Define content within the objects: Main objects contain core content, calls to action, metadata, and other nested objects (a piece of core content, but also an object). Our version of OOUX also includes optional permissions (identity management and access control) and states.
  • “Nest” and cross-link objects: Watch your associative system come to life! Infinite paths to finite objects.
  • Add actions/verbs to each object
  • Design in object modules
  • Document objects for development
  • and more!

__

We stand on the shoulders of giants and I would be remiss not to include these learning resources and places of inspiration that informed my method of OOUX:

Move on to part 2 of this series here.

Read more about design on the Think Blog.


Think Company

An award-winning experience design and development consultancy that works in highly-regulated and complex markets, building world-class digital experiences.